According to Project Counsel Media (“How a new type of AI is helping police skirt facial recognition bans”, available here), police and federal agencies have found a controversial new way to skirt the growing patchwork of laws that curb how they use facial recognition: an AI model that can track people using attributes like body size, gender, hair color and style, clothing, and accessories.
The authors “recently saw a demo of the technology courtesy of one of our media partners, the MIT Technology Review.” The tool, called Track and built by the video analytics (and eDiscovery) company Veritone, is used by 400 customers, including state and local police departments and universities all over the U.S.
Use of the tool is also expanding at the U.S. federal level. The Department of Justice began using Track for criminal investigations last August. Veritone’s broader suite of AI tools, which includes bona fide facial recognition, is also used by the Department of Homeland Security – which houses immigration agencies – and the Department of Defense.
As you can imagine, the product has drawn criticism from the American Civil Liberties Union, which – after learning of the tool through MIT Technology Review – said it was the first instance they’d seen of a nonbiometric tracking system used at scale in the U.S. They warned that it raises many of the same privacy concerns as facial recognition but also introduces new ones at a time when the Trump administration is pushing federal agencies to ramp up monitoring of protesters, immigrants, and students.
The demonstration of Track analyzed people in footage from different environments, ranging from the January 6th Capitol riots to subway stations. Users can find people by specifying body size, gender, hair color and style, shoes, clothing, and various accessories. The tool can then assemble timelines, tracking a person across different locations and video feeds. It can be accessed through Amazon and Microsoft cloud platforms.
As the article notes: “When asked if Track differentiates on the basis of skin tone, a company spokesperson said it’s one of the attributes the algorithm uses to tell people apart but that the software does not currently allow users to search for people by skin color.”
Track’s expansion comes as laws limiting the use of facial recognition have spread, sparked by wrongful arrests in which officers have been overly confident in the judgments of algorithms. Numerous studies have shown that such algorithms are less accurate with nonwhite faces. Laws in Montana and Maine sharply limit when police can use it – it’s not allowed in real time with live video – while San Francisco and Oakland, California have near-complete bans on facial recognition.
Track provides an alternative.
And it brings up a point made many times before: Though such laws often reference “biometric data,” the phrase is far from clearly defined. It generally refers to immutable characteristics like faces, gait and fingerprints rather than things that change, like clothing. But certain attributes, such as body size, blur this distinction.
As the article notes, advancements in AI technology have enabled automation and expedited analysis of video evidence, which is why the NYC police were able to so quickly track Luigi Mangione, the man alleged to have shot and killed UnitedHealthcare CEO Brian Thompson.
Expect a battle between law enforcement and civil liberties advocates over this new type of AI, which will likely be just as fierce as the battle over facial recognition. As usual, a big part of the battle will be over how it’s used, not whether it’s used. Stay tuned.
So, what do you think? Are you concerned about AI tracking of people on video? Please share any comments you might have or if you’d like to know more about a particular topic.
Image created using Microsoft Designer, using the term “robot walking in a crowd with a dark coat and dark glasses”.
Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.