Justice in Forensic Algorithms

The Justice in Forensic Algorithms Act: Legal Technology Trends

Hat tip to Tom O’Connor for a great topic suggestion! The Justice in Forensic Algorithms Act aims to ensure that when algorithmic analyses are used as evidence in court, defendants get to know how the tools reached their conclusions.

Last year, Rep. Mark Takano (D-Calif.) and Rep. Dwight Evans (D-Penn.) reintroduced the Justice in Forensic Algorithms Act to ensure that defendants have access to source code and other information necessary to exercise their confrontation and due process rights when algorithms are used to analyze evidence in their cases. The legislation is also designed to establish standards and testing to enable a robust conversation about how these algorithms work and whether they are accurate and fair enough to be used in the criminal justice system.

So-called forensic algorithms, used for purposes like helping law enforcement agencies compare DNA samples and fingerprints from a crime scene against those of potential culprits and identify faces from photos, have limitations that are not always made clear, as reported in this article. In a 2021 report, the Government Accountability Office (GAO) said analysts and investigators have run into challenges around bias, “misuse” and “difficulty interpreting and communicating results.”

Algorithms may be used at various stages of the criminal justice system, from influencing pretrial and parole decisions to providing evidence assessments used during trials. That same article references the use of the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) algorithm, which has come under scrutiny over the years for its tendency to score black defendants as far more likely to reoffend than non-black defendants (a tendency Maura R. Grossman discussed in an excellent Legalweek webinar last year, which I covered here on the IPRO blog).
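
To make that criticism concrete, here is a minimal, purely illustrative sketch of the kind of disparity analysis critics have applied to risk scores: comparing the false positive rate (defendants labeled high risk who did not go on to reoffend) across groups. The records, threshold and numbers below are entirely made up for illustration; this is not COMPAS data or its methodology.

```python
# Illustrative only: made-up records, not actual COMPAS data or methodology.
# Each record: (group, risk_score_1_to_10, reoffended_within_two_years)
records = [
    ("black", 8, False), ("black", 9, True), ("black", 7, False),
    ("black", 3, False), ("black", 6, True), ("black", 8, False),
    ("white", 4, False), ("white", 2, False), ("white", 7, True),
    ("white", 5, False), ("white", 3, True), ("white", 6, False),
]

HIGH_RISK_THRESHOLD = 7  # hypothetical cutoff for a "high risk" label

def false_positive_rate(group: str) -> float:
    """Share of non-reoffenders in `group` who were labeled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1] >= HIGH_RISK_THRESHOLD]
    return len(flagged) / len(non_reoffenders) if non_reoffenders else 0.0

for group in ("black", "white"):
    print(f"{group}: false positive rate = {false_positive_rate(group):.0%}")
```

When the false positive rates diverge sharply between groups, that is exactly the kind of finding defense counsel would want to probe, which is difficult to do when the scoring model itself is a black box.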

Back in 2016, I wrote about the case of Eric Loomis, whom COMPAS deemed a high risk of re-offending; the Wisconsin Supreme Court upheld the use of that assessment on appeal. A big issue with COMPAS has been the lack of transparency regarding the algorithm, which its developers have sought to protect as intellectual property.

In short, the Justice in Forensic Algorithms Act is designed to open the “black box” of forensic algorithms by:

  1. Prohibiting the use of trade secrets privileges to prevent defense access to source code and other information about software used to process, analyze, and interpret evidence in criminal proceedings;
  2. Directing the National Institute of Standards and Technology to establish both Computational Forensic Algorithm Testing Standards and a Computational Forensic Algorithm Testing Program; and
  3. Requiring Federal law enforcement to comply with the standards and testing requirements in their use of forensic algorithms.

Could those measures in the Justice in Forensic Algorithms Act extend to the civil arena? Well, I asked Tom for his thoughts on the topic, and he said: “If this proposed statute ever becomes law, I’ll be interested to see how long it takes for the concept to bleed into the civil arena. Much as the FRCP standards for document exchange eventually were used to generate an agreed upon protocol between the US Attorney and the Federal Defenders, I would expect Plaintiffs groups such as the AAJ or highly skilled individual Plaintiffs counsel will push to start using the statute’s precepts in civil cases as a matter of principle. FRCP 1 calls for all matters to be handled in a manner that is ‘…just, speedy and inexpensive’ and I expect arguments to be mounted that only a determination that AI or other TAR algorithms are not biased will satisfy the first of those three requirements.”

Interesting. Expect a titanic battle between those pushing for algorithm transparency and those pushing to keep their IP secret.

So, what do you think? Do you think AI and machine learning algorithms should be transparent, or do you think developers have a right to protect that information as part of their intellectual property? Please share any comments you might have, or let me know if you’d like to hear more about a particular topic.

Disclaimer: The views represented herein are exclusively the views of the author, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.
