Reps. Takano and Evans Reintroduce the Justice in Forensic Algorithms Act to Protect Defendants’ Due Process Rights in the Criminal Justice System
Riverside, CA – Today, Rep. Mark Takano (D-Calif.) and Rep. Dwight Evans (D-Penn.) reintroduced the Justice in Forensic Algorithms Act to ensure that defendants have access to source code and other information necessary to exercise their confrontation and due process rights when algorithms are used to analyze evidence in their case. This legislation will also establish standards and testing to enable a robust conversation about how these algorithms work and whether they are accurate and fair enough to be used in the criminal justice system.
“The trade secrets privileges of software developers should never trump the due process rights of defendants in the criminal justice system,” said Rep. Mark Takano. “As technological innovations enter our criminal justice system, we must ensure that they don’t undermine defendants’ rights. One of those technologies being used more and more is forensic algorithms. Forensic algorithms are black boxes, and we need to be able to look inside those black boxes to understand how the software works and to give defendants the ability to challenge it. This legislation will open the black box of forensic algorithms and establish testing standards that will safeguard our Constitutional right to a fair trial.”
Congressman Dwight Evans, D-PA, said, “Opening the secrets of these algorithms to people accused of crimes is just common sense and a matter of basic fairness and justice. People’s freedom from unjust imprisonment is at stake, and that’s far more important than any company’s claim of ‘trade secrets.’”
Across the country, law enforcement agencies are increasingly using a new type of software to partially automate the analysis and interpretation of evidence in criminal investigations and trials. These forensic algorithms have been used in thousands of criminal cases across the United States over the last decade to analyze everything from degraded DNA samples and faces in crime scene photos to gunshots and online file sharing. People are being convicted based on the results of these potentially flawed forensic algorithms without the ability to challenge this evidence due to the intellectual property interests of the software’s developers.
Only the developers know how these algorithms work. Because judges consistently side with developers on trade secret protections, defendants are being denied the ability to challenge the evidence used against them and to evaluate how these algorithms work. This presents a threat to due process rights and violates the confrontation rights guaranteed to defendants in the Constitution’s Bill of Rights, as well as in federal, state, and local law. The Justice in Forensic Algorithms Act protects due process rights by prohibiting the use of trade secrets privileges to prevent defendants from challenging the evidence used against them.
There is still much to learn about how effective and trustworthy forensic algorithms really are. A case in upstate New York demonstrates this subjectivity: two different probabilistic genotyping programs were used to analyze the same sample, and one program found a match to the suspect while the other found no match. Such inconsistencies point to the need for greater transparency and understanding of the subjective human decisions involved when this software is built and used. To address this, the Justice in Forensic Algorithms Act directs the National Institute of Standards and Technology (NIST) to establish Computational Forensic Algorithms Standards and a Computational Forensic Algorithms Testing program that federal law enforcement must comply with when using forensic algorithms. By establishing these standards and testing programs, defendants will have access to more information when evaluating the evidence used against them during a criminal proceeding.
In May 2020, the Government Accountability Office (GAO) released part one of a comprehensive report on the use of forensic algorithms by federal law enforcement. The study was conducted at the request of Rep. Mark Takano. According to GAO, “The first phase describes algorithms being used by federal law enforcement agencies and how these technologies work,” while the second phase will “assess the approaches and challenges related to how federal law enforcement agencies apply these technologies and will identify policy options for addressing these challenges going forward.” The second installment of the report is expected to be released later this spring.
The Justice in Forensic Algorithms Act has been endorsed by The Legal Aid Society and the Electronic Frontier Foundation.
Background on the legislation:
The Justice in Forensic Algorithms Act opens the black box of forensic algorithms by:
• Prohibiting the use of trade secrets privileges to prevent defense access to source code and other information about software used to process, analyze, and interpret evidence in criminal proceedings;
• Directing the National Institute of Standards and Technology to establish both Computational Forensic Algorithm Testing Standards and a Computational Forensic Algorithm Testing Program; and
• Requiring federal law enforcement to comply with standards and testing requirements in their use of forensic algorithms.
Dayanara Ramirez (202) 225-2305