When Were Credit Scores Invented? A Brief Look At History.
Your credit score is, for better or worse, one of the more important numbers in your life.
If you've ever borrowed money, you likely have at least one credit file with the major credit bureaus (Experian, Equifax, and TransUnion); credit cards, student loans, and auto loans all appear on those files.
It's estimated that more than 200 million Americans have credit records at one of the big three credit reporting agencies, according to the Consumer Financial Protection Bureau.
A credit score is a three-digit number, calculated by a credit scoring model, that helps lenders gauge a borrower's risk: how likely you are to repay a loan. Lenders use it both to decide whether to extend credit and to set the loan's interest rate.
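To make risk-based pricing concrete, here is a minimal Python sketch of how a lender might map scores to rate tiers. It is purely illustrative: the cutoffs, the APRs, and the `quote_apr` helper are invented for this example, not any real lender's pricing.

```python
# Hypothetical illustration of risk-based pricing: the score
# cutoffs and APRs below are invented for this example; every
# lender sets its own tiers.

RATE_TIERS = [
    (760, 0.059),  # 760 and up: best rate in this toy example
    (700, 0.069),
    (640, 0.089),
    (580, 0.119),
]

def quote_apr(score: int) -> float | None:
    """Return a hypothetical APR for a credit score, or None if declined."""
    for cutoff, apr in RATE_TIERS:
        if score >= cutoff:
            return apr
    return None  # below the lowest tier: application declined

print(quote_apr(720))  # 0.069 -> a 6.9% APR in this illustration
print(quote_apr(550))  # None  -> declined
```

The same score can therefore mean a cheaper loan, a more expensive one, or no loan at all, depending on where it falls in a lender's tiers.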
So, how did this single number come about?
The earliest years of credit reporting and credit bureaus
The concept of credit has existed for thousands of years, dating back to ancient civilizations that engaged in lending and borrowing. However, it was in the 1800s that the first credit bureaus began to appear in the United States. These early credit bureaus were local entities that collected information on consumers and businesses to help lenders determine creditworthiness. Initially, these bureaus relied on subjective assessments and personal opinions of individuals' creditworthiness rather than a standardized scoring system.
"The modern credit economy wouldn't have been possible without these credit bureaus," says Josh Lauer, an associate professor at the University of New Hampshire who studies consumer and financial culture. "Credit scores and credit reports help people who don't know each other make calculations to determine who they can trust."
Businesses started the nation's first consumer credit bureaus. Bureau agents gathered credit and personal information about individuals from landlords and employers, covering everything from outstanding debts to moral character. They also collected newspaper clippings and public records from courthouses.
The information collected in these files was primarily qualitative. So, in the 1930s, department stores decided to move in a more quantitative direction: based on this information, they began assigning points to individuals to gauge creditworthiness. But this early credit scoring was anything but research-based; points were awarded for characteristics such as race, income, neighborhood, and employment status.
The start of consumer credit reporting in the early 20th century
The rise of consumerism in the early 1900s can be attributed to several factors, including mass production, urbanization, and increasing disposable income. As the American economy shifted from agricultural to industrial, new products and services flooded the market, giving consumers access to a wider range of goods than ever before. With this increased availability, the desire for consumer credit began to snowball.
Retailers and finance companies were quick to capitalize on this demand for credit by offering various forms of consumer financing. Installment plans, store credit, and personal loans became popular methods for consumers to purchase goods and services they could not afford to pay for upfront. However, as the use of consumer credit expanded, so did the need for a more organized and efficient system for assessing creditworthiness.
During this time, local credit bureaus struggled to keep up with the growing volume of credit information. There was a clear need for larger, more centralized organizations that could handle the increasing amount of data. That demand led to the formation of bigger credit bureaus capable of processing and managing far higher volumes of credit information.
In 1899, the Retail Credit Company (RCC), the precursor to Equifax, was founded in Atlanta, Georgia. The RCC was created to provide retailers and finance companies with comprehensive, accurate, and up-to-date consumer credit information. By centralizing credit information, the RCC was able to offer a more organized and efficient means of assessing creditworthiness. This centralization allowed for better risk management by lenders and more informed decision-making when extending credit to consumers.
The RCC collected information on consumers through a network of correspondents, who provided data on individuals' financial habits and reputations within their communities. This data was then compiled into credit reports, which were sold to lenders and other businesses that required insight into the creditworthiness of potential customers.
The establishment of the RCC marked a significant step in the development of the credit reporting industry. Its success laid the foundation for other large credit bureaus, such as Experian and TransUnion, which emerged in the decades that followed. As consumer credit continued to grow and evolve, so did the credit reporting industry, eventually giving rise to the modern credit scoring models and consumer protections we know today.
The start of the credit score in the late 20th century
In 1956, engineer Bill Fair teamed up with mathematician Earl Isaac to found Fair, Isaac and Company, aiming to build a standardized, objective credit scoring system. In theory, a standardized rubric would eliminate the prejudice inherent in the credit evaluation and lending practices that had prevailed for many years. Today, Fair, Isaac and Company goes by a different name: FICO.
At first, the credit industry resisted the new, standardized method. When Fair Isaac began selling its statistical scorecard in 1958, only one company, American Investments, took up the system.
National department store chains were early adopters of the system when it debuted in the late 1950s; credit card issuers, auto lenders, and banks soon followed. They needed a dependable, efficient, and quick way to gauge a borrower's creditworthiness, and the Fair Isaac system provided this for them.
By the end of the 1970s, most lenders were using credit scoring. As credit reporting became more widespread, concerns about the accuracy of credit reports and the rights of consumers emerged. This led to the introduction of several key pieces of legislation aimed at protecting consumers and ensuring the accuracy of credit information. The Fair Credit Reporting Act (FCRA) of 1970 was the first major law to regulate credit bureaus, establishing rules for collecting and reporting credit information and providing consumers with the right to access and dispute their credit reports.
The current FICO score system premiered in 1989 and has become the industry standard. The number, which ranges from 300 to 850, is determined by five factors, listed here in descending order of importance: payment history (35%), amounts owed (30%), length of credit history (15%), credit mix (10%), and new credit, or recent inquiries (10%). Everything from bankruptcies to late payments factors into your score, and the model is occasionally tweaked to accommodate the evolving role of consumer credit and credit-related data.
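To make the weighting concrete, here is a minimal Python sketch of blending the five factors into a single number. It is a toy model, not FICO's proprietary algorithm: the category weights are FICO's published percentages, but the 0-to-1 factor ratings, the linear scaling to the 300-850 range, and the `illustrative_score` helper are hypothetical simplifications.

```python
# Illustrative sketch only: FICO's real model is proprietary.
# The category weights are FICO's published percentages; the
# 0-to-1 factor ratings and the linear scaling to 300-850 are
# hypothetical simplifications for demonstration.

FACTOR_WEIGHTS = {
    "payment_history": 0.35,
    "amounts_owed": 0.30,
    "length_of_history": 0.15,
    "credit_mix": 0.10,
    "new_credit": 0.10,
}

SCORE_MIN, SCORE_MAX = 300, 850

def illustrative_score(ratings: dict[str, float]) -> int:
    """Blend per-factor ratings (each 0.0 to 1.0) into a 300-850 number."""
    combined = sum(FACTOR_WEIGHTS[f] * ratings[f] for f in FACTOR_WEIGHTS)
    return round(SCORE_MIN + combined * (SCORE_MAX - SCORE_MIN))

# Example: strong payment history, moderate balances, short history.
print(illustrative_score({
    "payment_history": 0.95,
    "amounts_owed": 0.70,
    "length_of_history": 0.40,
    "credit_mix": 0.60,
    "new_credit": 0.80,
}))  # -> 708 on this toy scale
```

The point of the sketch is simply that payment history and balances dominate the result: a weak rating on either drags the final number down far more than a weak credit mix or a recent inquiry would.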
FICO Score 9 debuted to lenders in 2014 and to consumers in 2016, and it is available through all three major credit bureaus. In this scoring model, rent payments are included when they are reported, medical debt isn't weighed as heavily, and collection accounts that have been paid no longer count against your score.
Finally, the mortgage market's adoption of credit scoring cemented its ubiquity. In the mid-1990s, Freddie Mac and Fannie Mae began pulling FICO scores for their automated mortgage underwriting systems.
The use of credit scores today
Over the years, the use of credit scores expanded beyond lending decisions to include other areas, such as insurance underwriting, rental applications, and employment background checks. This growing reliance on credit scores led to the development of alternative scoring models, such as the VantageScore, which was introduced in 2006 by the three major credit bureaus (Equifax, Experian, and TransUnion).
Today, credit scores play a crucial role in the American economy, shaping both access to credit and the terms and interest rates offered to borrowers. With the rise of digital technology and data analytics, credit scoring models have become more sophisticated, incorporating a broader range of factors to assess credit risk more accurately.
What are the different credit score models?
There are several credit scoring models, each with its own methodology and set of factors. Some of the most notable models, and when they appeared, include:
- FICO Score: Introduced in 1989 by the Fair Isaac Corporation, the FICO score remains one of the most widely used credit scoring models today. FICO scores range from 300 to 850, with higher scores indicating lower credit risk.
- VantageScore: Launched in 2006, VantageScore was developed by the three major credit bureaus as an alternative to the FICO score. VantageScore also uses a range of 300 to 850 and considers similar factors as the FICO score, although the weighting of these factors may differ.
- Experian PLUS Score: The Experian PLUS Score is a proprietary model developed by Experian. It uses a range of 330 to 830 and is an educational score, designed to help consumers understand their own creditworthiness rather than for use in lending decisions.
- TransUnion CreditVision New Account Score: TransUnion developed its own scoring model called the New Account Score, which ranges from 300 to 850. This model is designed to predict the likelihood of a borrower becoming seriously delinquent on a new account within 24 months.
- Equifax Credit Score: Equifax also has its proprietary credit scoring model, which ranges from 280 to 850. The Equifax Credit Score is designed to provide a snapshot of a consumer's credit risk based on the information in their Equifax credit report.
The impact of credit scoring on consumer lending
Given that the FICO credit score was meant to create a fairer system, has that goal been achieved?
Undoubtedly, the standardized, objective approach to credit scoring has given many people access to lending, but the system is far from perfect.
There are some things you can't explain to an algorithm, such as a job loss that made it hard to pay your credit card bill.
"Because credit decisions are automated, instead of talking to an individual, you're dealing with data in a credit bureau that you don't necessarily have access to," Lauer says. "A system that seems democratic and objective can be individually unfair and sometimes cruel."
The credit invisibles and credit unscorables
Even with the rise of credit bureaus and credit scores, many people still lack access to borrowing or don't have enough information in their credit files to be scored.
It’s estimated that 26 million Americans are “credit invisible,” meaning that they don't have a credit history with one of the major credit bureaus. Another 19 million Americans are “credit unscorable,” meaning they have few or no accounts within their credit file or don’t have any recent credit history.
Even if an individual is an excellent money manager, lacking a credit score can impact the cost of borrowing or even shut them out of borrowing altogether.
A FICO score is a more impartial way to handle credit approval than having a credit manager make subjective judgments about applicants. But algorithms can make mistakes and reflect discriminatory biases; some, for example, reinforce racial disparities. While the scores themselves may not be explicitly discriminatory, credit inequalities have been shown to widen the racial wealth gap.
"On average, black Americans have lower credit scores than white Americans, which is evidence of what is embedded in society is reflected in the data," Lauer says.
According to a report by the Urban Institute, FICO scores for Black borrowers were, on average, 125 points lower than those for white borrowers in 2018. That year, Black consumers were also more likely than white consumers to lack a credit record at all.
The Bottom Line
The history of credit scores is a fascinating tale of innovation, consumer protection, and the quest for a more accurate credit risk assessment. Today, credit scores are an integral part of the American economy, helping lenders make informed decisions and empowering consumers to understand and manage their financial health. As the credit landscape continues to evolve, we can expect further developments in credit scoring models and the factors they consider, ultimately leading to a more accurate and fair assessment of creditworthiness.