DNA + Facial Recognition Technology = Junk Science


An intriguing article in WIRED tells the story of police who used DNA to predict a suspect's face and then tried running facial recognition technology on the resulting image.

BACKGROUND FACTS

In 2017, detectives working a cold case at the East Bay Regional Park District Police Department got an idea, one that might help them finally get a lead on the murder of Maria Jane Weidhofer. Officers had found Weidhofer, dead and sexually assaulted, at Berkeley, California’s Tilden Regional Park in 1990. Nearly 30 years later, the department sent genetic information collected at the crime scene to Parabon NanoLabs—a company that says it can turn DNA into a face.

Soon, Parabon NanoLabs provided the police department with the face of a potential suspect, generated using only crime scene evidence.

The image Parabon NanoLabs produced, called a Snapshot Phenotype Report, wasn't a photograph. It was a 3D representation of how the company's algorithm predicted a person might look, given genetic attributes found in the DNA sample.

The face of the murderer, the company predicted, was male. He had fair skin, brown eyes and hair, no freckles, and bushy eyebrows. A forensic artist employed by the company photoshopped a nondescript, close-cropped haircut onto the man and gave him a mustache—an artistic addition informed by a witness description and not the DNA sample.

In 2017, the department published the predicted face in an attempt to solicit tips from the public. Then, in 2020, one of the detectives asked to have the rendering run through facial recognition software. It appears to be the first known instance of a police department attempting to use facial recognition on a face algorithmically generated from crime-scene DNA.

At this point it is unknown whether the Northern California Regional Intelligence Center honored the East Bay detective’s request.

DOES THIS SEARCH VIOLATE CONSTITUTIONAL RIGHTS?

Some argue this search illustrates how law enforcement can mix and match technologies in unintended ways. In short, it uses untested algorithms to single out suspects based on unknowable criteria.

“It’s really just junk science to consider something like this,” Jennifer Lynch, general counsel at civil liberties nonprofit the Electronic Frontier Foundation, tells WIRED. Running facial recognition with unreliable inputs, like an algorithmically generated face, is more likely to misidentify a suspect than provide law enforcement with a useful lead, she argues.

“There’s no real evidence that Parabon can accurately produce a face in the first place . . . It’s very dangerous, because it puts people at risk of being a suspect for a crime they didn’t commit.” ~Jennifer Lynch, General Counsel at Electronic Frontier Foundation.

According to a report released in September by the US Government Accountability Office, only 5 percent of the 196 FBI agents who have access to facial recognition technology from outside vendors have completed any training on how to properly use the tools. The report notes that the agency also lacks any internal policies for facial recognition to safeguard against privacy and civil liberties abuses.

In the past few years, facial recognition has improved considerably. In 2018, when the National Institute of Standards and Technology tested face recognition algorithms on a mug shot database of 12 million people, it found that 99.9 percent of searches identified the correct person. However, the NIST also found disparities in how the algorithms it tested performed across demographic groups.

In a 2019 report for Georgetown's Center on Privacy and Technology, Clare Garvie, a facial recognition expert and privacy lawyer, found that law enforcement agencies nationwide have used facial recognition tools indiscriminately. They've tried using images that include blurry surveillance camera shots, manipulated photos of suspects, and even composite sketches created by traditional artists.

“Because modern facial recognition algorithms are trained neural networks, we just don’t know exactly what criteria the systems use to identify a face . . . Daisy chaining unreliable or imprecise black-box tools together is simply going to produce unreliable results. We should know this by now.” ~ Clare Garvie, Esq.

Please contact my office if you, a friend, or a family member is charged with a crime. Hiring an effective and competent defense attorney is the first and best step toward justice.