“Last week we saw yet another reminder of the ways algorithms will perpetuate historical bias if left unchecked. A proposed rule released Monday by the Department of Housing and Urban Development (HUD) announced the intent to reduce key protections afforded to consumers under the Fair Housing Act. The new rule would revise HUD’s interpretation of Title VIII of the Civil Rights Act of 1968 to eliminate the disparate impact standard, which prohibits policies or procedures that appear to be neutral but actually result in a disproportionate adverse impact on protected groups.

“The proposed rule would also eliminate a key consumer protection by shielding institutions that use biased algorithms, as long as the challenged model is produced, maintained, or distributed by a recognized third party. Many of the AI-based financial products institutions use today are not created in-house but by outside vendors. Thus, if a bank were to use an AI product to determine whether to award a loan, and that product employed algorithms that result in race- and gender-based loan decisions, the bank would not be liable for its racist, sexist practices. This not only limits consumers’ ability to defend themselves against discriminatory practices, but also eliminates any incentive for institutions to investigate whether the algorithms are discriminatory in the first place…”

Read the full article here: