Banking

Your AI credit models are fine, but their training data is problematic

December 4, 2024

AI systems built to assess creditworthiness are trained on data that implicitly accepts past discriminatory lending decisions as legitimate signals about borrowers today, writes Deon Crasto, of Velocity Global.


Artificial intelligence promises lenders faster decisions and broader access to credit, but it often perpetuates existing inequities. Be wary: your AI lending model may not be as fair and objective as it appears.

Don’t believe me? Consider a few examples. First, car loans: researchers at the University of Bath reported that women were disproportionately less likely to be approved for loan originations than their male counterparts, even after controlling for other financial factors. With mortgages, we see a very similar story. A 2024 examination that used leading large language models to assess creditworthiness found that Black applicants were more likely to be denied than their white counterparts. And it’s not just race. The pattern extends to age, postal code and even the college you attended.

At the end of the day, lenders are looking for deterministic factors to underwrite products, and that’s what’s going on here. I know this all too well: I ran product for the data science and decisioning team at OnDeck Capital, and we looked at every data point we could get our hands on. And I mean it. Got a bad Yelp rating? It was accounted for in our model. Your Foursquare check-ins were down? Oh, we knew. We even considered factors like seasonality in cash flow and how businesses in your neighborhood were doing. Our machine learning, or ML, models were designed to process thousands of data points and make lending decisions in seconds.


But I’m here to offer an alternate narrative. I think your AI models are fine (for now); it’s your data that is fundamentally flawed. The issue isn’t the algorithms themselves but the historical data we feed them. Models are trained on datasets that go back decades, so if a certain group has historically been denied loans at higher rates, ML models will implicitly associate that group with “high risk.” The model doesn’t know it’s being unfair; it’s simply learning the patterns we’ve provided.
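The mechanism above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration with synthetic data, not any lender's actual model: two groups have identical incomes and identical true repayment ability, but group B was historically denied 30% of the time even when creditworthy. A naive "model" that scores applicants by their group's historical approval rate faithfully reproduces the bias.

```python
import random

random.seed(0)  # deterministic for the illustration

# Synthetic history: same income distribution and true risk for both groups,
# but group B carries a 30% extra denial rate baked into the labels.
def make_history(n=10_000):
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        income = random.gauss(50_000, 10_000)   # identical for both groups
        truly_good = income > 45_000            # true repayment ability
        approved = truly_good
        if group == "B" and truly_good and random.random() < 0.30:
            approved = False                    # the historical bias
        rows.append((group, approved))
    return rows

# A deliberately naive "model": score each group by its historical approval
# rate, the way a real model absorbs group-correlated proxies from labels.
def fit_group_rates(history):
    stats = {}
    for group, approved in history:
        ok, total = stats.get(group, (0, 0))
        stats[group] = (ok + approved, total + 1)
    return {g: ok / total for g, (ok, total) in stats.items()}

rates = fit_group_rates(make_history())
print(rates)  # group B scores markedly lower despite identical true risk
```

A real credit model uses thousands of features rather than a group label, but proxies such as postal code smuggle the same signal in; the learned gap here exists purely because the labels, not the applicants, differ.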

The problem is exacerbated by what we in the industry call “thin files”: credit reports with limited history. This disproportionately affects young adults and recent immigrants, arguably the two groups most in need of access to credit. Their alternative is to take on loans to build credit, often loans they cannot afford and on unfavorable terms, creating a Catch-22 that can trap people in a cycle of debt.

The impact of thin files on creditworthiness is staggering. According to a recent LexisNexis study, banks in the U.K. could be denying loans to 80% of adults with thin credit files, many of them low-risk customers. Traditional lending models typically deem these applications high risk and auto-decline them through “hard cuts,” a process in which applications are eliminated for missing specific criteria deemed necessary for approval. When those criteria are absent, the model disregards any other relevant financial information, however rich, effectively shutting these individuals out of credit.
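A hard-cut pre-filter looks something like the sketch below. The field names and thresholds are invented for illustration; the point is structural: the filter runs before any scoring model, so a thin file is declined no matter how strong its other financial signals are.

```python
# Hypothetical "hard cut" pre-filter: applications missing required criteria
# are auto-declined before any scoring model ever runs. Field names and
# thresholds are illustrative, not any real lender's policy.
HARD_CUTS = [
    ("credit_history_years", lambda v: v is not None and v >= 3),
    ("open_tradelines", lambda v: v is not None and v >= 2),
]

def hard_cut_filter(application: dict):
    """Return (passed, failed_field). Thin files fail before scoring."""
    for field, rule in HARD_CUTS:
        if not rule(application.get(field)):
            return False, field
    return True, None

# High income and savings cannot rescue a thin file...
thin_file = {"credit_history_years": 1, "income": 85_000, "savings": 40_000}
# ...while a long history passes even with lower income.
thick_file = {"credit_history_years": 7, "open_tradelines": 4, "income": 38_000}

print(hard_cut_filter(thin_file))   # (False, 'credit_history_years')
print(hard_cut_filter(thick_file))  # (True, None)
```

Note that the thin applicant's income and savings never even reach the rule that declines them, which is exactly the failure mode described above.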

So how do we solve this? I’d argue the future of lending models lies in building a more holistic picture of creditworthiness. We need to diversify our data sources, rigorously back-test for bias and make our models as transparent as possible. That transparency should extend to the consumer, who should be able to see the factors influencing their creditworthiness.
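The simplest form of the back-testing mentioned above is a demographic-parity style check: compare approval rates across groups on past decisions and flag any gap beyond a tolerance. The sketch below is illustrative; real fairness audits use several metrics and statistically chosen thresholds.

```python
# Hypothetical bias back-test on past lending decisions: measure the spread
# in approval rates across groups (a demographic-parity style check).
# The 5% tolerance is chosen purely for illustration.
def approval_rate_gap(decisions, group_key="group", outcome_key="approved"):
    totals = {}
    for d in decisions:
        ok, n = totals.get(d[group_key], (0, 0))
        totals[d[group_key]] = (ok + int(d[outcome_key]), n + 1)
    rates = {g: ok / n for g, (ok, n) in totals.items()}
    return max(rates.values()) - min(rates.values()), rates

# Toy decision log: group A approved 70% of the time, group B only 50%.
decisions = (
    [{"group": "A", "approved": True}] * 70
    + [{"group": "A", "approved": False}] * 30
    + [{"group": "B", "approved": True}] * 50
    + [{"group": "B", "approved": False}] * 50
)

gap, rates = approval_rate_gap(decisions)
flagged = gap > 0.05
print(round(gap, 2), flagged)  # 0.2 True -- a 20-point gap trips the check
```

A flagged gap is a starting point for investigation, not proof of discrimination on its own; the value of the check is that it runs automatically against every retrained model before deployment.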


It shouldn’t just be about smarter algorithms. It should be about smarter, fairer and more complete data. And on some level, it’s about ensuring algorithmic accountability and the ethical application of AI and ML in products that have a far-reaching impact on society.
