A credit score is a number, generated by a mathematical formula, designed to tell a company how likely a person is to pay off a loan completely and on time. Companies use these scores to decide whether a person should be offered a mortgage, credit card, auto loan or other credit product.
Credit scores are a frequent source of frustration on social media, and people’s complaints about the reliability of this widely used credit scoring system have been shared hundreds of thousands of times.
One recurring claim about credit scores is that the universal credit score used today wasn’t invented until 1989.
THE QUESTION
Were credit scores as we know them today invented in 1989?
THE SOURCES
- FICO
- Equifax
- Fair Credit Reporting Act
- Equal Credit Opportunity Act
- John Ulzheimer, a credit expert who writes for badcredit.org
THE ANSWER
Yes. Fair, Isaac and Company, now known as FICO, created its universal credit score in 1989, and the FICO score is used in the majority of lending decisions today. However, credit reporting and narrower, industry-specific credit scoring existed before the universal score.
WHAT WE FOUND
Credit bureaus have existed in the United States since the 1800s. These bureaus put together reports on a consumer’s lending history, and until government regulations in the 1970s, those reports included personal information like marital status, race and gender. Companies began developing and using scores specific to certain credit industries, such as credit cards, in the mid-to-late 1900s, but there wasn’t a universal score uniformly applied to all lending situations until 1989.
The first universal, personal credit score was created by Fair, Isaac and Company, now known as FICO, in 1989. The score was unique because it was universal: one score that could be applied to any credit offer, regardless of industry. The FICO score, which ranges from 300 to 850, is used today in the vast majority of lending decisions.
“We launched the FICO Score in 1989 as a universal and impartial tool for evaluating credit risk, and in 1991, it became available from all three major U.S. credit reporting agencies,” a FICO employee said in a company blog post. “We weren’t the first or only scoring option available at all three major credit bureaus, but we were the first with a common design blueprint.”
Credit reporting, and even industry-specific credit scoring, had existed long before then, but those scores were typically tailored to particular industries and based on different criteria from company to company.
Credit-reporting bureaus were first established in the mid-1800s to collect information on consumers’ lending histories, although they were often small and locally based, said credit expert John Ulzheimer in an article on badcredit.org. One such bureau, founded in Atlanta in 1899 as Retail Credit Company, later became Equifax, one of today’s three major credit-reporting bureaus.
But throughout much of America’s history, even into the late 1900s, businesses often decided whether to grant a person a loan based on subjective character judgments and biases.
“Early credit reporting wasn’t without its problems,” Ulzheimer said. “It was often subjective, unfair, and didn’t lend itself to consistent credit decisions. In response to these problems, Congress passed a series of laws designed with consumer protection in mind.”
The first of those laws was the Fair Credit Reporting Act of 1970, which required credit reports to be accessible to the people they describe, put time limits on how long negative information can remain on a report and limited who can look at a person’s credit report. The second was the 1974 Equal Credit Opportunity Act, a civil rights law that prevents lenders from discriminating based on race, religion, gender and a host of other personal characteristics irrelevant to a person’s lending history. These laws also limited what information credit-reporting bureaus could keep on consumers.
At the time, the company that is now FICO had been working to develop automated credit scoring in an attempt to reduce or remove the subjectivity in lending decisions. It built its first credit scoring system for the American investment industry in 1958, just two years after its founding. Its early scores were designed for specific industries or companies, such as scores that credit card lenders could use to make decisions.
In 1989, FICO built the BEACON score for Equifax; Equifax still calls its version of the FICO score BEACON to this day. All three major credit bureaus began using FICO scores by 1991. FICO claims its scores are used by 90% of top lenders today.
But today’s credit scores are still plagued by many of the problems that have long existed in credit reporting. Although each credit bureau uses the same formula or very similar formulas, the data they keep in their reports can differ and is sometimes inaccurate. Credit scores can also reinforce the very biases they were meant to eliminate.
“A FICO score is probably a more impartial way to handle credit approval than just having some bank representative make a superficial judgment about potential applicants,” said the writers at financial management education blog OppU. “But algorithms can actually reinforce racial disparities that already exist.”
An estimated 45 million Americans have no credit score, and they tend to be low-income, younger and members of minority groups, the U.S. Government Accountability Office said in January 2022. That has led the Consumer Financial Protection Bureau to explore more widespread inclusion of alternative data, such as rent and utility payments, in credit reporting.