Regulators cannot tolerate corporate actors using opaque artificial intelligence (AI) fed by inaccurate information to deny tenants rental housing or gouge them on rent, the director of the federal consumer financial protection agency said Tuesday.
In remarks prepared for delivery at a Community Table event on the White House blueprint for a renter's bill of rights, Consumer Financial Protection Bureau (CFPB) Director Rohit Chopra said his agency will join the Federal Housing Finance Agency (FHFA), the U.S. Department of Housing and Urban Development (HUD), and the U.S. Department of Agriculture (USDA) in making clear this week that landlords and property managers must tell tenants when they use screening reports to deny housing or to raise fees or rent.
“Prospective tenants have a right to know this and to challenge false information,” Chopra said.
Chopra asserted that when algorithms used in housing decisions, including rentals, “produce thumbs up or thumbs down decisions, without a second look, without applicant review, and without consideration of what information is true and what is false, the landlord, property manager, or tenant screening company may be breaking the law.” He cited the Fair Credit Reporting Act (FCRA) as the law in question.
“We are on the lookout for inaccurate AI and illegal practices that lead to junk data,” Chopra said. “For example, relying on name matching alone is illegal because it is especially likely to result in inaccurate information. People with common last names are especially likely to be at risk from so-called ‘name-only’ matching.”
Chopra said the FCRA contains no exemption that allows companies to break the law because their AI or other technology doesn’t work.