“Mommy’s going to jail.”
That’s what 32-year-old Porcha Woodruff told her two daughters as she was being handcuffed in front of her home last February. She had been getting her girls ready for school when six police officers showed up at her residence in Detroit. They told her to step outside: she was under arrest for robbery and carjacking.
In disbelief, Porcha “gestured at her stomach to indicate how ill-equipped she was to commit such a crime: She was eight months pregnant.” But no amount of pleading could convince the officers of her innocence. Leaving her whimpering girls with her fiancé, she was taken to the Detroit Detention Center.
Porcha Woodruff had become the third Detroit resident to be wrongly arrested because of a technology the city adopted in 2019: facial recognition software.
A Powerful but Dangerous Tool
The Detroit Police Department “uses a facial recognition vendor called DataWorks Plus to run unknown faces against a database of criminal mug shots.” In Porcha’s case, a mug shot from her arrest in 2015 for driving with an expired license was among the photos the software matched with gas station surveillance footage of the actual perpetrator. The carjacking victim, a 25-year-old male, then mistakenly chose Porcha’s mug shot from a “six-pack photo lineup.”
According to Gary Wells, a psychology professor who has studied the reliability of eyewitness identifications, pairing artificial intelligence with human judgment “is circular and dangerous. You’ve got a very powerful tool that, if it searches enough faces, will always yield people who look like the person on the surveillance image.” Dr. Wells further explained that the technology compounds an existing problem with eyewitnesses: “They assume when you show them a six-pack, the real person is there.”
Even more disturbing, some facial recognition vendors have more than mug shots in their databases. In September 2019, around the same time the Detroit PD began using DataWorks Plus, a sheriff’s office in Louisiana took out a $25,000-a-year subscription to Clearview AI, a vendor that has “scraped billions of photos from the public web, including social media sites, to create a face-based search engine now used by law enforcement agencies.”
Last year, a 29-year-old man with many photos on LinkedIn and Facebook ended his Thanksgiving weekend by being extradited from Georgia to Louisiana. Thanks to Clearview AI, he spent nearly a week in jail for allegedly stealing designer purses in a state he had never even visited.
Shoddy Technology vs. Shoddy Investigations
On August 3, Porcha and her attorney filed a lawsuit against the Detroit Police Department, contending that “AI technology was the reason she was falsely identified as a carjacking suspect.” Responding in a press conference on August 9, Police Chief James White blamed the wrongful arrest on “investigative lapses, not faulty facial recognition technology.”
White explained that the detective on the case failed to follow the department’s facial recognition policy, which states that a software match “shall be considered an investigative lead,” not a cause for arrest, “and the requesting investigator shall continue to conduct a thorough and comprehensive investigation.”
In Porcha’s case, no such investigation followed the software match of her mug shot with the surveillance footage. Chief among the lapses: the detective never asked the victim whether the female perpetrator (who had a male accomplice) appeared pregnant! That description alone, according to the police chief, should have eliminated Porcha as a suspect.
But what if the accused hadn’t been eight months pregnant? “I would probably be fighting a case right now that’s not mine,” Porcha told CBS News on August 10. She and her attorney are sticking to their claim that “shoddy technology,” as well as shoddy detective work, is to blame for her arrest. According to the American Civil Liberties Union of Michigan, the first makes room for the second, “and police assurances that they will conduct serious investigations do not ring true.”
God’s Facial Recognition Software
The Bible describes another kind of facial recognition technology, one that matches us to our sins with perfect precision: God’s law of love, divided into two tablets (Matthew 22:37–40).
James talks about the second tablet, “You shall love your neighbor as yourself” (2:8), when he mentions the commandments “Do not commit adultery” and “Do not murder” (v. 11). In chapter 1, he describes how the software works:
“If anyone is a hearer of the word and not a doer, he is like a man observing his natural face in a mirror; for he observes himself, goes away, and immediately forgets what kind of man he was. But he who looks into the perfect law of liberty and continues in it, … this one will be blessed in what he does” (vv. 23–25).
In other words, God’s law is a mirror that shows us our sins, written on our very faces! Attitudes, even before they are expressed in gestures or words, reveal themselves in facial “microexpressions.” Thus, when Cain became “very angry, … his countenance fell” (Genesis 4:5). Wanting Cain to see his sin before it led to the crime of murder, God held up His mirror: “Why are you angry? And why has your countenance fallen? If you do well, will you not be accepted? And if you do not do well, sin lies at the door” (vv. 6, 7).
Sadly, Cain turned away from “the perfect law of liberty” and slew his brother Abel.
When Porcha Woodruff’s face was mistakenly chosen based on a faulty software match, she spent 11 hours sitting on a concrete bench in a holding cell. But God’s facial recognition software produces no mistaken matches. If we don’t allow His righteousness to replace the sins we see in the mirror, where will we spend eternity?
Want to spend it with God? Watch “5 Steps to Eternity” to see what Pastor Doug says about receiving the gift of eternal life.