A state commission tasked with protecting civil rights is now tackling discrimination and bias by artificial intelligence.
The Michigan Civil Rights Commission passed a resolution Monday to establish a set of guiding principles to prevent AI from being the only tool used to evaluate people for things like housing, hiring or insurance.
For landlords and property managers, AI is becoming an essential tool to sift through tenant applications and background checks, but state officials say it can also cause unintended discrimination.
“If the AI system comes up with a score that would make the person ineligible or not suitable for being a renter, that person should have recourse and say: ‘Hey! I'd like a human to review my application,’” said Gloria Lara, chair of the Michigan Civil Rights Commission.
Recently, AI software used to automate the tenant selection process has come under scrutiny for allegedly hiking rent prices. In August, the U.S. Department of Justice announced it was suing the real estate company RealPage for allowing landlords of multifamily homes to use its AI algorithm to set rents above market rate.
“The public should be protected from abusive data practices and should have agency over how data about them is used,” said Lara.
RealPage’s software is currently used by several mid-Michigan property management companies, including Lansing’s Gillespie Group, according to the company’s website. Gillespie Group manages hundreds of residential properties, including Westbury Lake Apartments in Eaton County and Prudden Place in Ingham County.
The Michigan Civil Rights Commission’s resolution states, “there is substantial evidence that unless purposefully addressed the tendency of AI systems to incorporate biased and discriminatory data will result in the perpetuation of discriminatory outcomes with serious implications for the civil rights of Michigan citizens.”
Discrimination by AI is not new. In 2019, a study published in Science found that a health care algorithm used to identify the sickest patients was more likely to flag white patients for additional medical attention than Black patients who were actually sicker.
“The public should know that an AI empowered automated system is being used and to understand how and why it contributes to outcomes that impacts them,” added Lara.
This measure is the second of its kind for the Michigan Civil Rights Commission; in April, the group adopted recommendations for the use of AI in policing.