An ACLU test of Amazon's "Rekognition" facial identification tool falsely matched 28 members of Congress, disproportionately people of color, with criminal mugshots, documenting algorithmic bias, a phenomenon covered in Interpersonal Divide in the Age of the Machine.
According to the ACLU, false matches included “six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.).” The organization is calling on Congress to join its efforts to halt law enforcement use of face surveillance.
You can read a full account of the test as covered by National Public Radio.
The ACLU test paired Rekognition software with a database of 25,000 arrest photos and then searched that database against photos of current members of Congress.
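The mechanics of such a test can be sketched in a few lines: the matcher returns a similarity score for each gallery photo, and any score that clears the confidence threshold is reported as a match. The sketch below is not the ACLU's actual code; the gallery names and scores are invented. The 80 percent figure is Rekognition's documented default threshold, which the ACLU reported using, and 99 percent is the stricter setting Amazon later recommended for law enforcement use.

```python
DEFAULT_THRESHOLD = 80.0  # Rekognition's default similarity threshold (percent)

def flag_matches(probe_scores, threshold=DEFAULT_THRESHOLD):
    """Return the gallery entries whose similarity score meets the threshold."""
    return [(name, score) for name, score in probe_scores if score >= threshold]

# Hypothetical similarity scores for one legislator's photo
# against a gallery of arrest photos.
scores = [
    ("mugshot_0412", 81.3),
    ("mugshot_1177", 62.0),
    ("mugshot_2490", 85.9),
]

print(flag_matches(scores))        # two "matches" clear the 80% default
print(flag_matches(scores, 99.0))  # none survive the stricter setting
```

The point the sketch makes is that the false-match count is a function of the threshold: the same scores that produce two hits at the default setting produce none at 99 percent, which is why the choice of default matters so much in deployed systems.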
The results were no surprise to those who have read the new edition of Interpersonal Divide, which has covered Amazon and Facebook since 2004, noting how revenue-generating apps like “Rekognition” have changed ethical and social norms at home, school and work.
Here’s an excerpt about algorithmic racism from the second edition:
If you believe that institutional racism exists, that systems and organizations over time believe falsehoods about under-represented groups, then imagine the long-term consequences if such bias is coded in and programmed into machines. For instance, if machines compile data suggesting that a certain race, gender and age of people living in a given location may have a higher inclination for wrongdoing, and that person happens to wander into a wealthier section of the neighborhood, merchants equipped with apps might be prone to mistake innocent shoppers for potential shoplifters, depriving them of service or worse, accusing them of crimes.
Interpersonal Divide documents algorithmic racism across digital platforms and datasets, including decisions associated with social justice, such as determining whether inmates should be granted parole. (Penal boards in half the states use algorithms in parole hearings.)