
Will AI Technology Undermine the Faith and Integrity of Our Judicial System?

October 5, 2023

This article was first published in the Detroit Legal News on October 5, 2023.

A. Vince Colella
Moss & Colella P.C.

Audio and video have long proved to be reliable evidence for lawyers and prosecutors presenting their cases. For example, surveillance footage of a robbery in progress, clearly showing a person armed with a gun walking into a store, robbing a clerk, and escaping, would normally be an open-and-shut case for a prosecutor. However, advancements in AI technology have now made it possible to alter sounds and images in a manner that is nearly undetectable. Manufacturing or altering evidence is not entirely new. However, the ability to distort reality has taken an exponential leap forward with “deepfake” technology. Lightning advancements in artificial intelligence have not only created an ability to alter images, but to create videos of actual people doing and saying things that never occurred. Machine learning has made these created images far more realistic and nearly impossible to detect. An evidentiary nightmare for the court system.

Fake video depictions of real people began to emerge on the internet in late 2017. Surprisingly, the technology did not require elaborate Hollywood cameras and editing equipment. The new technology allows anyone with a smartphone to map movements and words onto someone else’s face and voice, making them appear to say or do anything. And the more video that is fed into the deep-learning algorithms, the more convincing the result. The danger of deepfake technology is twofold. First, it may be used, as in the example above, to depict an act or statement attributable to a person that never took place. Second, it opens the door to deepfake bias, which can be used to delegitimize genuine audio and video evidence. Recently, Tesla was sued by the family of a man who died when his car crashed while using the self-driving feature. During the trial, the family’s lawyers cited a statement made by Tesla founder Elon Musk in 2016 claiming that its Model S and Model X vehicles were capable of being driven autonomously with greater safety than a person. While the statement was in fact uttered by Musk at a conference, lawyers for the car company suggested that Musk was the subject of several fake videos saying and doing things that he had not said or done — casting doubt on whether his statements about the safety of his vehicles were true.

The capacity for creating undetectable videos of everyday people has cast a shroud of doubt over what we have come to accept as reliable. Thus, the ability to manufacture images and sound may well cause jurors to question otherwise reliable evidence. A double-edged sword of evidentiary deceit.

This has opened the floor to debate among legal scholars on how to remedy deepfakes and the bias they create. The challenges include (1) proving whether audiovisual evidence is genuine or fake; (2) confronting claims that genuine evidence is a deepfake; and (3) addressing a growing distrust among jurors of audiovisual evidence. From an authenticity standpoint, we might see a sharp rise in the use of AI experts to confront and present evidence.

Analysis of metadata and source information will likely be used to prove the veracity of an image. Experts will also be required to weigh in on unusual or unnatural elements within an image. For example, if an image has the perfect symmetry or flawless patterns typical of AI-generated images, experts can be called to testify to these inauthentic features. However, deep scientific dives into reliable video evidence will not only prove costly to the litigants but also be disruptive to the efficiency of our judicial system.

Complex legal issues caused by the evolution of science and technology are often solved with the basic tenets of jurisprudence. Historically, solutions to problems surrounding the presentation of evidence in legal proceedings were governed by existing, non-exhaustive means of authentication in the state and federal rules of evidence. While authenticity can be proved in several ways, lawyers primarily rely on witnesses to confirm that what we see and hear in audio-visual reproduction exists in real life. However, because artificial intelligence is so difficult to detect, forensic and scientific examination will likely be the best way to ferret out reliable evidence from the deepfakes.

The importance of maintaining judicial integrity cannot be overstated. Therefore, forensically keeping pace with artificial intelligence is paramount to the fair administration of justice and to preserving our country’s faith in the system. Lawyers and judges must stay mindful of the potential for fraudulent audio-visual evidence and ensure that jurors are not duped into doubting evidence that is, in fact, genuine.


Vince Colella is a founding partner of Southfield-based personal injury and civil rights law firm Moss & Colella.


