Aggregated News

Photo: Dorothy Roberts

A shocking story of wrongful arrest in Detroit has renewed scrutiny of how police departments are deploying facial recognition software despite the technology’s major flaws. Porcha Woodruff was arrested in February when police arrived at her house and accused her of robbery and carjacking. Woodruff, who was eight months pregnant at the time, insisted she had nothing to do with the crime, but police detained her for 11 hours, during which she had contractions. She was eventually released on a $100,000 bond, and prosecutors dropped the case a month later, admitting that her arrest was based in part on a false facial recognition match. Woodruff is the sixth known person to be falsely accused of a crime because of a faulty facial recognition match, and all six victims have been Black. “That’s not an accident,” says Dorothy Roberts, director of the University of Pennsylvania Program on Race, Science and Society, who notes that new technologies often reflect societal biases when they are built atop flawed systems. “Racism gets embedded into the technologies.”

Transcript

AMY GOODMAN: Professor Roberts, I wanted to end by asking...