Technology,
Real Estate/Development,
Civil Rights
Aug. 12, 2025
How algorithmic bias keeps renters out and puts fair housing to the test
Bias baked into tenant-screening algorithms is locking out Black, Latino, and immigrant renters, testing whether fair housing laws can withstand political rollbacks in the AI era.
Gary W. Rhoades
Civil Rights Attorney Specializing in Fair Housing
Email: garyrhoades2323@gmail.com
UC Davis School of Law (King Hall); Davis, CA
Gary Rhoades was previously the Litigation Director of the Southern California Housing Rights Center and a Deputy City Attorney for Santa Monica, drafting and enforcing the City's laws against housing discrimination and tenant harassment. Gary is currently on the editorial board for the ABA's Human Rights Magazine and Special Counsel for the City of Inglewood's Housing Protection Department.

Few laws protected tenants from housing discrimination before the federal Fair Housing Act (FHA) became law in April 1968. Landlords routinely and explicitly refused Black applicants, often with overt, in-your-face racism. Black families bore the brunt of this discrimination: they were forced to spend more time and money finding housing, and they often ended up in overcrowded, unsafe neighborhoods.
In Los Angeles, rental housing discrimination, along with redlini...