Facial Recognition Technology Hasn’t Had a Good Week for News: Artificial Intelligence Trends

Reading a couple of stories yesterday, I realized that facial recognition technology hasn’t had a good week for news.

On January 22, 2022, two men barged into a Sunglass Hut in Houston and robbed the store at gunpoint. One of the men forced two terrified employees into a back storeroom and told them to stay until the criminals left. The men got away with thousands of dollars worth of cash and sunglasses and fled in a vehicle with a stolen license plate.

As Houston police detectives were investigating, Anthony Pfleger, the head of security for Sunglass Hut’s parent company EssilorLuxottica, contacted them to say the company had identified one of the robbers through facial recognition technology. Harvey Eugene Murphy Jr. (age 61) was their guy, Pfleger allegedly told the detectives. According to Pfleger, Murphy had also robbed the store on a previous occasion, along with a nearby Macy’s. One of the employees supposedly identified Murphy as the robber in a photo lineup. Houston police issued a warrant for Murphy’s arrest, and he was taken into custody at the DMV when he went to renew his license.


There was only one problem with that arrest: Murphy was about 2,000 miles away in Sacramento, California on January 22, 2022. He couldn’t have been involved in the robbery of the Sunglass Hut.

As reported by Law & Crime, Murphy filed a lawsuit against the owners of Sunglass Hut and Macy’s after he was falsely identified as a violent armed robber through facial recognition technology. While the employee identified Murphy as the robber in a photo lineup, the lawsuit claims the employee was “prepped” before seeing the lineup.

Murphy hadn’t been arrested since the early 1990s. It was only when a judge arraigned him on the charges that he learned the robbery had occurred while he was in California. His court-appointed attorney relayed that information to prosecutors, who dropped the charges.

Great, right?


Sadly, according to the lawsuit, a few hours before he was to be released, Murphy was followed into a bathroom by three fellow inmates and “was beaten, forced on the ground, and brutally gang raped.” Murphy is seeking at least $10 million in damages.

Facial recognition technology in that case was at least operating from an actual face. According to this article from Wired, detectives at the East Bay Regional Park District Police Department working a 1990 cold case murder got an idea in 2017: they sent genetic information collected at the crime scene to Parabon NanoLabs, a company that says it can turn DNA into a face.

If you’re into true crime at all (like I am), you’ve probably heard of Parabon NanoLabs. By applying genetic genealogy to crime scene DNA, the company has helped solve numerous cold cases, including the identification of the Golden State Killer; more than 265 positive identifications in criminal cases have been achieved through its technology. You may also have seen that the company can predict facial characteristics from DNA samples. Here’s an example of an age progression from a Snapshot Phenotype Report, which is a 3D rendering of what the suspect might look like:

Great, right?

Well, it is, until one of the detectives did something civil liberties experts say is problematic—and a violation of Parabon NanoLabs’ terms of service:

He asked to have the rendering run through facial recognition software.

Say what?

While it’s unknown whether the Northern California Regional Intelligence Center honored the East Bay detective’s request, the Wired article cites at least two other instances where detectives have considered applying facial recognition technology to 3D rendered images generated by Parabon NanoLabs.

“It’s really just junk science to consider something like this,” Jennifer Lynch, general counsel at civil liberties nonprofit the Electronic Frontier Foundation, told Wired. Running facial recognition with unreliable inputs, like an algorithmically generated face, is more likely to misidentify a suspect than provide law enforcement with a useful lead, she argues. “There’s no real evidence that Parabon can accurately produce a face in the first place,” Lynch says. “It’s very dangerous, because it puts people at risk of being a suspect for a crime they didn’t commit.”

BTW, Parabon NanoLabs doesn’t condone this practice. In 2016, the company added a clause to its terms of service prohibiting customers from using facial recognition on its Snapshot Phenotype Reports (while acknowledging to Wired that it “does not have a way to ensure compliance” with those terms).

When an AI algorithm hallucinates and delivers phony case citations that a lawyer, failing in their duty of competence, puts into a filing, that’s one thing. When too much faith is put into AI algorithms that have been shown to be less than reliable, or those algorithms are used in ways they weren’t intended, and someone is falsely accused of a crime and spends unwarranted time locked up, that’s a much bigger issue.

Facial recognition technology isn’t inherently bad; it has led to the identification of many criminals who might not otherwise have been caught and convicted. The problem is trusting AI technology too much, without the due diligence to confirm whether the results are correct, whether you’re a lawyer putting bogus case citations into a filing without checking them or law enforcement pursuing an arrest without confirming whether the suspect could have committed the crime. The problem is the same; it’s the stakes that are different.

So, what do you think? Is facial recognition technology the problem? Or is it how that technology is used? Please share any comments you might have or if you’d like to know more about a particular topic.

Image created using GPT-4’s Image Creator Powered by DALL-E, using the term “robot getting a facial recognition scan”.

Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.


Discover more from eDiscovery Today by Doug Austin

