AI enables lenders to spot bias claims in customer complaints to CFPB

In one 2020 complaint to the Consumer Financial Protection Bureau, a consumer echoed the words of George Floyd to describe an experience with a financial company, saying "you all will not let me breathe." The consumer wanted to know why the firm would "not take their knee off ... my neck?"

Another criticized a company for its approach to sexual identity issues. “The employees refused to be sensitive to my pronouns and name change,” the consumer said. “As a result, my account was closed after years of torture from this credit card company.”

Such complaints, visible in the CFPB's public complaint portal, can have reputational repercussions for a financial institution if they point to discriminatory behavior by its employees. But artificial intelligence offers firms a way to analyze complaint data and get ahead of those risks.

“It only takes one complaint to drive your institution to where you don’t want to be,” said Marcia Tal, a 25-year veteran of decision management at Citigroup who founded the New York artificial intelligence firm Tal Solutions.

Other AI firms analyze the CFPB data, and some financial firms do it in-house, but Tal's company is one of the first to focus on using the CFPB's complaint portal to diagnose cases of bias in an institution's customer-facing teams.

Seeing Floyd's words like that in a complaint after his killing last year at the hands of Minneapolis police would raise an immediate risk management concern for a company, Tal said. She noted that similar language referencing Floyd's death had come up in six other complaints in the CFPB portal. (The names of institutions in CFPB complaints with such details are typically blocked out.)

Tal said firms should worry about potential signs of bias in connection with a consumer being denied a loan, but consumers also raise social justice concerns in complaints about credit reporting, managing a bank account or struggling to pay bills because of COVID-19.

“We’re trying to get ahead of all these different pain points that are a risk to institutions,” Tal said.

One consumer posted a complaint in September 2020 referring to multiple companies that "will not let me breathe and they stubbornly refuse to take their knees off my neck." Another complaint filed in February 2021 stated: "And simply because they have their foot on my neck, they charge fees so that this account — which is all cash, loses value every month."

Using AI to identify signs of bias within an organization gives companies a head start to prepare for any action taken by the CFPB.

In the CFPB’s annual fair-lending report to Congress last week, acting Director Dave Uejio said the bureau’s top priorities include taking “bold and swift action to address issues of pervasive racial injustice and the long-term economic impacts of the COVID-19 pandemic on consumers.”

He vowed to find practical ways “to make freedom from racial prejudice and pursuit of racial equity a priority.”

And last month, Uejio warned that the bureau is analyzing disparities in how companies address complaints from minorities. He also put financial firms on notice that the CFPB will not tolerate discrimination against the LGBTQ community.

Experts say weeding out discrimination and addressing allegations of bias are critically important for risk managers given that the CFPB routinely uses complaints to look more closely at a financial institution’s operations.

"Complaints are a focal point for regulators and serve as a launchpad for investigations and enforcement against marketplace abuses,” said Heather McArn, a partner at Hinshaw & Culbertson and the former chief of staff and special counsel at the New York State Department of Financial Services. "Consumer complaints are a gold mine compliance opportunity for any institution."

Some AI experts said they were not surprised that Floyd’s words showed up in some CFPB complaints.

“It’s definitely the case that something that happens in society can trigger or awaken awareness of what’s going on" and Floyd’s death "raised awareness of these problems,” said Sanjib Kalita, the CEO of guppy.ai, a credit bureau startup built on blockchain technology. “A comment about ‘I can’t breathe,’ in a [consumer] complaint is very illuminating and also a tremendous opportunity for financial firms to be aware of some of these issues and how it might involve changing how they communicate certain things, or changing a product.”

Though many financial institutions analyze complaint data both internally and externally, Tal said the focus should be on substance over volume. Issues related to race, gender identity and mistrust of a financial institution need to be identified and even escalated to risk managers, she said. Most complaints about discrimination come from face-to-face interactions at bank branches, though they also appear in narratives describing the opening of home and auto loans, and checking and savings accounts, she said.

When companies fail to respond and find the root cause of a problem, consumers often copy and paste the same issue into a new complaint because it wasn't resolved the first time, and their frustration escalates. The follow-up complaints often state that the company is not taking them seriously or is even mocking them, Tal said.
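
Tal did not describe the mechanics of spotting these repeat complaints. As a rough sketch only, a monitoring tool could flag near-duplicate narratives from the same consumer using a standard text-similarity measure; the function and the 0.9 similarity threshold below are illustrative assumptions, not a description of any vendor's product.

```python
from difflib import SequenceMatcher

# Illustrative cutoff: treat narratives that are ~90% similar as likely repeats.
REPEAT_THRESHOLD = 0.9

def is_likely_repeat(new_narrative: str, prior_narratives: list[str]) -> bool:
    """Return True if a new complaint closely matches one the consumer filed before."""
    new_text = new_narrative.lower().strip()
    return any(
        SequenceMatcher(None, new_text, prior.lower().strip()).ratio() >= REPEAT_THRESHOLD
        for prior in prior_narratives
    )

# Example: a consumer re-files essentially the same complaint after getting no resolution.
earlier = ["The bank closed my account with no explanation and will not return my calls."]
follow_up = "The bank closed my account with no explanation and still will not return my calls."
print(is_likely_repeat(follow_up, earlier))  # True -> escalate rather than restart triage
```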

While complaints alleging discrimination or bias are infrequent, they can have an outsized impact on a company’s reputation. Tal also noted that many of the complaints in 2021 “link the pandemic to a consumer’s health and the struggle to pay their bills.”

“We are trying to look at the most severe complaints to measure how frustrated the consumer is based on the language they use,” said Tal.

The AI platform launched by Tal's firm in 2019 to identify prejudice is known as PositivityTech. The platform analyzes language and phrases to determine a customer's frustration and whether it is severe. In addition to analyzing customer complaints, a company can also use the platform for in-house monitoring of employee complaints about the work environment.
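
Tal's firm has not published how PositivityTech measures frustration, so the snippet below is only a minimal sketch of phrase-based severity scoring; the phrase list, weights and escalation cutoff are invented for illustration rather than drawn from the platform.

```python
import re

# Invented phrase weights for illustration; a production system would rely on
# trained language models and far richer lexicons, not a hand-picked list.
SEVERE_PHRASES = {
    r"(can'?t|not let me) breathe": 3,
    r"knee (off |on )?my neck": 3,
    r"discriminat\w+": 2,
    r"mock(ed|ing)": 2,
    r"years of torture": 2,
    r"not taking me seriously": 1,
}

def severity_score(narrative: str) -> int:
    """Sum the weights of flagged phrases found in a complaint narrative."""
    text = narrative.lower()
    return sum(weight for pattern, weight in SEVERE_PHRASES.items() if re.search(pattern, text))

complaint = "You all will not let me breathe. Why will you not take your knee off my neck?"
score = severity_score(complaint)
if score >= 3:  # illustrative escalation cutoff
    print(f"Escalate to risk management (severity score: {score})")
```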

Artificial intelligence is increasingly used to analyze the specific words, and even the tone, used in consumer complaints. Using AI technology to manage risk has been embedded in most financial institutions’ compliance and risk management systems for years.

“Risk managers identify and rank-order where hot spots are from a compliance standpoint,” said Cliff Rossi, a finance professor at the University of Maryland’s Robert H. Smith School of Business and former chief risk officer at Citi, Washington Mutual Inc. and Countrywide Financial Corp.

Even complaints that appear unrelated to bias can carry reputational or business risk, Rossi said. Finding the complaints that matter most can be a challenge, given that the CFPB received roughly 542,300 complaints last year and nearly 60% of them were about credit reporting, according to a recent CFPB report.

“All this information comes in and just like in the old days, you get people looking over complaints — and some might gloss over some important contextual comments that would have been addressed if they had been elevated earlier,” Rossi said.

Tal's firm also does more general analysis of the CFPB's complaint data, such as ranking the most common issues raised by consumers who dealt with COVID-related setbacks.

Many see the use of AI technology in compliance as a way to assist financial firms in assessing complaints in real time and resolving them quickly. Complaints to the CFPB are sent directly to financial institutions, which generally have to respond within 15 days.

Regulators also are turning their attention to the uses of AI for operational, governance and risk management purposes. Five federal regulators including the CFPB issued a request for information last week on the uses of AI, including machine learning.

Still, it can be hard to pick out trends from the volume of consumer complaints. If a consumer wants to close an account, for example, a company should pay attention specifically to what happened and why, especially given the expense of attracting new customers. Debt collection is another pain point for institutions that can lead to lawsuits and enforcement actions, Tal said.

Companies also rely heavily on consumer feedback when launching new products and services.

“Hearing pain points from customers is really critical,” said Kalita. “When you get inbound complaints, it’s often hard to pick out trends from the volume of data and being able to automate that with intelligent algorithms raises the level of what an institution or a regulator can do.”

These consumer protection issues are now in focus for a group of regulators who are experienced, diverse and data-savvy, said McArn.

"Firms are smart to invest in tools that can assist them in real-time assessment and resolution of those complaints," she said.

The CFPB’s complaint data is open to the public, and more than 35% of complaint narratives are published. By using AI to analyze and measure the actual language used in complaints, companies can understand their biggest risks and try to repair a product or practice.

“The CFPB is not only sharing statistics, but also the narratives, the stories and the words,” Tal said. “It’s quite a rich data environment if institutions listen to what their customers are saying.”
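
As a rough sketch of what that kind of listening could look like, the snippet below loads the bureau's bulk complaint export and flags published narratives containing bias-related phrases. The download URL, column names and phrase list are assumptions about the database's public export format and should be verified before use.

```python
import pandas as pd

# Assumed location of the CFPB Consumer Complaint Database bulk export; check the
# bureau's download page for the current URL and column names before relying on this.
COMPLAINTS_URL = "https://files.consumerfinance.gov/ccdb/complaints.csv.zip"

# Illustrative phrases tied to the bias themes discussed in this article.
BIAS_PHRASES = ["breathe", "knee off my neck", "discriminat", "pronouns"]

# The export is large (several gigabytes uncompressed); a production job would stream
# or chunk it, but this sketch simply loads the columns of interest.
complaints = pd.read_csv(
    COMPLAINTS_URL,
    usecols=["Date received", "Product", "Company", "Consumer complaint narrative"],
)

narratives = complaints["Consumer complaint narrative"].fillna("").str.lower()
mask = narratives.apply(lambda text: any(phrase in text for phrase in BIAS_PHRASES))
flagged = complaints[mask]
print(f"{len(flagged)} published narratives mention a flagged phrase")
```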
