
Why the police must stop using facial recognition technologies

On January 9, 2020, officers from the Detroit Police Department (DPD) arrested me in my front yard in Farmington Hills, Michigan, in front of my wife and two young daughters, for a crime I had nothing to do with. They wouldn’t tell me why, and I had to sleep that night on a cold concrete bench in a crowded, filthy jail cell before finally finding out I was falsely accused of stealing designer watches from a Detroit boutique.

During my interrogation, two detectives let slip that my arrest was based on a false facial recognition match, a technology that has been proven to be racist and flawed—especially when used in real-world conditions, such as on blurry surveillance footage.

This week, we finally reached a settlement in my lawsuit against the City of Detroit for wrongful arrest that will ensure that what happened to me will not happen again.

Facial recognition technology has access to massive databases containing millions of photos — including, at the time of my arrest, a database of 49 million photos that includes every driver’s license photo from Michigan over the past few years. Anyone with a driver’s license can be added to these databases. The technology searches them all for similar-looking faces and spits out some possible suspects. Police would tell you they only use this information as a “lead” and then conduct meaningful investigations, but my own personal experience and that of other wrongfully arrested people across the country disprove that claim.

Case in point: The system somehow returned my expired driver’s license photo as an “investigative lead” that might match the thief. Instead of investigating the accuracy of this alleged match, police accepted the “lead” and placed my photo in a row with five other photos of Black men – each of whom looked less like the thief, since no computer algorithm had decided their photos looked similar enough to the thief to be a possible match. The witness (who hadn’t even seen the crime, just reviewed the surveillance footage) chose my photo from this doctored lineup. And that’s all the evidence the DPD relied on to arrest me.

Read more: Artificial intelligence has a problem with gender and racial bias. Here’s how to solve it

When I was finally released after 30 hours, I learned that my eldest daughter had lost her first tooth while I was in custody – a precious childhood memory now tainted by the trauma our entire family endured. She also turned a photo of our family face-down because she couldn’t bear to see my face after watching the police take me away. The girls even started playing cops and robbers, telling me I was the robber. Over the last four years, there have been many moments when I have had to explain to two little girls that a computer had wrongfully put their father in jail.

The false charges were eventually dropped, but not before I had to go to court to defend myself against something I didn’t do. After they were dropped, I demanded an apology from the police officers and urged them to stop using this dangerous technology. They ignored me.

Since my story broke in 2020, we’ve learned of two other Black men in Detroit, Porcha Woodruff and Michael Oliver, who were also wrongfully arrested for crimes they didn’t commit because police relied on faulty facial recognition technology. Similar stories continue to emerge across the country.

In a fairer world, police would be banned from using this technology altogether. This settlement could not go that far, but police use of this dangerous and racist technology will now be much more tightly controlled. They will no longer be able to conduct a photo lineup based solely on a facial recognition lead. Instead, they will only be able to conduct a lineup after using facial recognition if they first find independent evidence linking the person identified by facial recognition to a crime. In other words, police can no longer use facial recognition as a replacement for basic investigative work.

But their obligations don’t end there. When DPD uses facial recognition in an investigation, it must inform courts and prosecutors of any deficiencies and weaknesses in the facial recognition search conducted, such as poor photo quality, as in my case, where grainy surveillance footage was used. DPD must also train its officers on the limitations and inaccuracies of facial recognition technology, including the fact that it is much more likely to falsely identify Black people than white people.

What the Detroit Police Department put me through changed my life forever. When I was sent to jail, I felt like I was in a bad movie that I couldn’t leave. In the years since my wrongful arrest, my family and I have traveled across Michigan and the country, calling on politicians to protect their constituents from the horror I experienced by stopping law enforcement from abusing this technology. I have repeatedly stated that I do not want anyone else to have to live with the fear and trauma that facial recognition technology has caused my family. With the resolution of my case, we are one big step closer to that goal.