On March 30, 2023, Patrice Ficklin, head of the Consumer Financial Protection Bureau's Office of Fair Lending, publicly clarified for the first time that consumer lenders have an affirmative duty to monitor, refine and update lending models to ensure that no less-discriminatory models are available. This statement is critical because lenders do not pursue less-discriminatory alternative (LDA) underwriting models consistently enough, for a variety of reasons, including that LDA searches have historically been cumbersome and may yield less accurate models. Fortunately for millions of Americans historically underserved by our financial system, new artificial intelligence and machine learning tools can facilitate more effective searches that quickly and efficiently yield multiple alternative models that are both less discriminatory and equally accurate.
Against this backdrop, Ficklin's clarification seems like a simple and clear affirmation of the Equal Credit Opportunity Act and its implementing regulation, Regulation B. Taken in conjunction with the bureau's warning to lenders against using technologies in ways that hamper compliance, the fair lending clarification could ultimately prove a watershed moment in advancing the use of AI in consumer finance to enhance fairness and financial inclusion. For that to happen, however, regulators must take additional bold action to ensure that American consumers benefit from proper application of a law intended to increase fairness, inclusion and, ultimately, access to credit.
First, the bureau and other regulators should explicitly recognize that LDA search using AI tools is an advantageous application of the technology for financial services, given AI's ability to rapidly compare multiple models in searching for alternatives that are fairer and less discriminatory. Under the Equal Credit Opportunity Act, all lenders are required to assess whether their current lending models have a discriminatory impact on protected classes, and then to ascertain whether LDAs are available that would satisfy their legitimate business objectives. Advances in fair lending analytics are making these searches more accessible and efficient for all lenders, with significant benefits for consumers.
Recent research published by the nonprofit FinRegLab highlighted the potential advantages of using AI tools to comply with LDA search requirements, as well as the risks of using AI without adequate attention to fairness. Advanced, explainable AI technologies for credit underwriting that incorporate robust LDA searches as part of fair lending testing foster fairness and inclusion in financial services.
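To make the idea concrete, the sketch below shows, in very simplified form, what an LDA-style comparison can look like: several candidate underwriting models are scored on both accuracy and a disparity measure, so that an alternative that is less discriminatory but comparably accurate can be identified. Everything here is hypothetical and illustrative only, using synthetic data, scikit-learn, varying regularization as a stand-in for genuinely different model specifications, and an adverse-impact-ratio-style metric; it does not represent the bureau's expectations, FinRegLab's methodology, or any vendor's product.

```python
# Hypothetical sketch of a less-discriminatory-alternative (LDA) comparison.
# Synthetic data and toy metrics only; not a compliance methodology.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 5000

# Synthetic applicant data. The protected-class flag is used only to measure
# disparity after the fact, never as a model input.
income = rng.normal(50, 15, n)
protected = rng.integers(0, 2, n)
utilization = np.clip(rng.uniform(0, 1, n) + 0.15 * protected, 0, 1)
y = (0.04 * income - 2.0 * utilization + rng.normal(0, 1, n) > 0).astype(int)

X = np.column_stack([income, utilization])
X_tr, X_te, y_tr, y_te, _, p_te = train_test_split(
    X, y, protected, test_size=0.3, random_state=0)

def adverse_impact_ratio(approvals, group):
    """Approval rate for the protected group divided by the reference group's."""
    return approvals[group == 1].mean() / approvals[group == 0].mean()

# Candidate models: varying regularization strength stands in for the many
# alternative specifications a real LDA search would explore.
candidates = {f"C={c}": LogisticRegression(C=c) for c in (0.01, 0.1, 1.0, 10.0)}

for name, model in candidates.items():
    model.fit(X_tr, y_tr)
    approvals = model.predict(X_te)
    acc = accuracy_score(y_te, approvals)
    air = adverse_impact_ratio(approvals, p_te)
    # A candidate is a less-discriminatory alternative if its ratio is closer
    # to 1.0 while accuracy stays within tolerance of the incumbent model.
    print(f"{name}: accuracy={acc:.3f}, adverse impact ratio={air:.3f}")
```

In practice, the candidate set, fairness metrics and accuracy tolerances would be far richer than this toy loop, but the basic trade-off it surfaces, disparity against predictive performance, is the comparison an LDA search is meant to make routine.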
Second, as my colleague argued in these pages last year, regulators must make clear that merely identifying a less-discriminatory model is not, on its own, evidence of past wrongdoing. Today, many lenders fail to pursue robust LDA searches, whether due to a lack of sophistication in developing and testing alternative models, to inertia or apathy, or to fear that acknowledging an LDA may somehow indicate wrongdoing with respect to legacy models. Lenders should instead be encouraged to perform robust searches and improve their models rather than stick with the status quo for fear of incurring liability.
And finally, as we explained in our December 2020 comment letter, the bureau should issue public guidance on its regulatory expectations for LDAs, including how it assesses the robustness of LDA search techniques and methodologies. Clarity about the material metrics and factors lenders should consider in LDA search and deployment, and about options for balancing fairness with accuracy, would accelerate alignment with the bureau's express expectations. Coherent, compliant application of AI technology holds real promise for American consumers and the financial services providers who serve them.