Loan Think

CFPB should support the use of AI to reduce lending discrimination

"Regulators must make clear that mere acknowledgement of a less-discriminatory [consumer lending] model is not, alone, evidence of past wrongdoing," Yolanda D. McGill of Zest AI writes as part of her call for policies that promote continuous improvement of underwriting systems.
Ting Shen/Bloomberg

On March 30, 2023, Patrice Ficklin, head of the Consumer Financial Protection Bureau's Office of Fair Lending, publicly clarified for the first time that consumer lenders have an affirmative duty to monitor, refine and update lending models to ensure that no less-discriminatory models remain available. This statement is critical because the pursuit of less-discriminatory alternative (LDA) underwriting models does not happen consistently enough, for a variety of reasons: LDA searches have historically been cumbersome to conduct and may yield less accurate models. Fortunately for the millions of Americans historically underserved by our financial system, new artificial intelligence and machine learning tools can facilitate more effective searches that quickly and efficiently yield multiple less-discriminatory, equally accurate alternative models.

Against this backdrop, Ficklin's clarification seems like a simple and clear affirmation of the Equal Credit Opportunity Act and its implementing regulation, Regulation B. Taken together with the bureau's warning to lenders against using technologies in ways that hamper compliance, the fair lending clarification could ultimately prove to be a watershed moment in advancing the use of AI in consumer finance to enhance fairness and financial inclusion. For this moment to be realized, however, regulators must take additional, bold action to ensure that American consumers benefit from proper application of a law intended to increase fairness, inclusion and, ultimately, access to credit.

First, the bureau and other regulators should explicitly recognize that LDA search using AI tools is an advantageous application of the technology for financial services, given AI's ability to rapidly compare multiple models in searching for alternatives that are fairer and less discriminatory. Under the Equal Credit Opportunity Act, all lenders are required to assess whether their current lending models have a discriminatory impact on protected classes, then ascertain whether LDAs are available that would still satisfy their legitimate business objectives. Advances in fair lending analytics are making these searches more accessible and efficient for all lenders, with significant benefits for consumers.
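To make the mechanics concrete, the sketch below shows, under purely hypothetical assumptions about the candidate models, data fields, approval cutoff and accuracy tolerance, how an automated LDA screen might compare candidate underwriting models against a baseline on one accuracy metric (AUC) and one disparity metric (the adverse impact ratio of approval rates). It is an illustration of the search concept, not any regulator's or vendor's methodology.

```python
# Illustrative sketch only: a minimal screen for less-discriminatory
# alternative (LDA) models. The candidate models, field names, cutoff and
# tolerance are hypothetical assumptions, not a compliance standard.
import numpy as np
from sklearn.metrics import roc_auc_score

def adverse_impact_ratio(approved, protected):
    """Approval rate of the protected-class group divided by the approval
    rate of the comparison group (values closer to 1.0 mean less disparity)."""
    approved, protected = np.asarray(approved), np.asarray(protected)
    return approved[protected == 1].mean() / approved[protected == 0].mean()

def screen_candidates(candidates, defaulted, protected, cutoff=0.5, auc_tol=0.01):
    """`candidates` maps a model name (including "baseline") to predicted
    default probabilities on the same holdout sample. Returns candidates
    that show less disparate impact than the baseline while staying within
    `auc_tol` of its predictive accuracy."""
    results = {}
    for name, scores in candidates.items():
        approved = (np.asarray(scores) < cutoff).astype(int)  # approve low predicted risk
        results[name] = {
            "auc": roc_auc_score(defaulted, scores),
            "air": adverse_impact_ratio(approved, protected),
        }
    base = results["baseline"]
    return {
        name: r
        for name, r in results.items()
        if name != "baseline"
        and r["air"] > base["air"]                  # less disparate impact
        and r["auc"] >= base["auc"] - auc_tol       # comparable accuracy
    }
```

A production LDA search would, of course, weigh more metrics, segments and business constraints than this toy comparison does; the point is simply that once candidate models exist, the comparison itself is mechanical and fast.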

Recent research published by the nonprofit FinRegLab highlighted the potential advantages of using AI tools to comply with LDA search requirements (as well as the risks of using AI without adequate attention to fairness). Advanced, explainable AI technologies for credit underwriting that include robust LDA searches as part of fair lending testing foster fairness and inclusion in financial services.

Second, as my colleague argued in these pages last year, regulators must make clear that mere acknowledgement of a less-discriminatory model is not, alone, evidence of past wrongdoing. Today, many lenders fail to pursue robust LDA searches, whether due to a lack of sophistication in developing and testing alternative models, inertia or apathy, or fear that acknowledging an LDA may somehow indicate wrongdoing with respect to legacy models. Lenders should instead be encouraged to perform these searches and improve their models rather than stick with the status quo for fear of incurring liability.

And finally, as we explained in our December 2020 comment letter, the bureau should issue public guidance regarding LDA regulatory expectations, including how the bureau assesses the robustness of LDA search techniques and methodologies. Clarity as to the material metrics or factors that lenders should consider in LDA search and deployment processes, and as to options for balancing fairness with accuracy, would accelerate alignment with the bureau's express expectations. Coherent, compliant application of AI technology holds real promise for American consumers and the financial services providers who serve them.
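If guidance were to address the fairness-accuracy balance directly, one plausible (and purely illustrative) way a lender could document that trade-off is to identify which candidate models are not dominated on either dimension, i.e., the Pareto frontier of accuracy and disparity. The function below assumes the per-model "auc" and "air" figures produced by a screen like the one sketched above; neither the metrics nor the structure is prescribed by the bureau.

```python
# Hypothetical illustration of documenting the fairness/accuracy trade-off:
# keep only candidate models that are not dominated on both accuracy (AUC)
# and fairness (adverse impact ratio, where higher means less disparity).
# The metric choices are assumptions for this sketch, not requirements.
def pareto_frontier(results):
    """`results` maps model name -> {"auc": float, "air": float}. A model is
    on the frontier if no other model is at least as accurate and at least
    as fair, with a strict improvement on at least one dimension."""
    frontier = {}
    for name, r in results.items():
        dominated = any(
            other["auc"] >= r["auc"] and other["air"] >= r["air"]
            and (other["auc"] > r["auc"] or other["air"] > r["air"])
            for other_name, other in results.items()
            if other_name != name
        )
        if not dominated:
            frontier[name] = r
    return frontier
```

A lender could then explain why it selected a particular frontier model, which is exactly the kind of reasoning that clearer public guidance would make easier to standardize and examine.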
