EXECUTIVE SUMMARY:
Earlier this year, a facial recognition system mistakenly identified Robert Julian-Borchak Williams as the perpetrator of a crime. As a result, Mr. Williams was arrested in front of his wife and children and detained in a jail cell for 30 hours.
This case highlights serious flaws in facial recognition technologies. To identify Mr. Williams, the Detroit police department used technology from DataWorks Plus, a company that holds a $5.5 million state contract despite evidence that its technology is biased.
In 2019, a US government study found that the DataWorks Plus technology, among others, displayed significant algorithmic bias against Black and Asian people. DataWorks has since implemented a new algorithm that “tightens the differences in accuracy between different demographic cohorts.”
Evidently, the new algorithm still contains serious flaws. In addition, police investigators in the case failed to seek corroborating evidence, relying only on an AI-based photo lineup and a loss-prevention employee’s memory to identify him.
When the Williams family contacted defense attorneys, most presumed him guilty and quoted high prices for representation.
“We’ve been active in trying to sound the alarm bells around facial recognition, both as a threat to privacy when it works and a racist threat to everyone when it doesn’t,” said Phil Mayor, an attorney with the American Civil Liberties Union.
Although the case has since been dismissed, questions around the use of facial recognition systems remain.
For more on this story, visit The New York Times.