Latest News

Fake Spotting: The Challenge of Authenticating Photos in a Generative AI World

October 2023 • Source: Lars Daniel, Envista Forensics, EnCE, CCO, CCPA, CIPTS, CWA, CTNS, CTA

Imagine having the ability to create brand-new content with just a few clicks. This is the power of generative artificial intelligence (AI), a rapidly developing technology that can generate text, images, and videos based on existing data. The potential of generative AI to revolutionize industries such as marketing, entertainment, and product development is remarkable.

The potential for generative AI to positively impact the world cannot be overstated. However, it is crucial to be aware of the risks of misuse. In this article, I home in on the challenges posed by fake photos generated with artificial intelligence.

Common Procedures Before the AI Era

Before generative AI, video and photo forensics experts used various methods to determine if a photo was fake. Some of the most common procedures included:

Analyzing Metadata

A photo’s metadata can contain information about the camera used to take the picture, as well as the date and time the photo was taken. Forensic experts can use this information to identify inconsistencies that may indicate a photo is fake. For example, if the metadata indicates the photo was taken with a camera that was not yet available on the date the photo is purported to have been taken, that is a clear sign of forgery.
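The core logic of this kind of metadata check can be sketched in a few lines. The camera names, release dates, and function below are invented for illustration; real examiners work from EXIF data and authoritative camera databases.

```python
from datetime import date

# Hypothetical illustration: flag a photo whose claimed capture date
# predates the release date of the camera model named in its metadata.
# The models and release dates in this table are made up.
CAMERA_RELEASE_DATES = {
    "ExampleCam X100": date(2020, 6, 1),
    "ExampleCam X200": date(2022, 3, 15),
}

def metadata_is_consistent(camera_model: str, capture_date: date) -> bool:
    """Return False if the photo claims to predate its own camera."""
    released = CAMERA_RELEASE_DATES.get(camera_model)
    if released is None:
        return True  # unknown model: no basis to flag
    return capture_date >= released

# A photo "taken" in 2021 with a camera released in 2022 is a red flag.
print(metadata_is_consistent("ExampleCam X200", date(2021, 1, 1)))  # False
print(metadata_is_consistent("ExampleCam X100", date(2021, 1, 1)))  # True
```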

Analyzing Lighting and Shadows

Forensic experts can look for inconsistencies in the lighting and shadows in a photo to identify signs of manipulation. If a shadow falls in the wrong direction, or if two objects cast shadows in different directions, the photo may have been edited. Forensic experts use various tools and techniques to analyze the lighting and shadows in a photo, such as measuring the angles of shadows and comparing the brightness of different areas of the image.

Analyzing Textures

Forensic experts can also look at the textures in a photo to identify signs of manipulation. If someone's skin looks too smooth or plastic-like, this could be a sign that the image has been edited. Forensic experts can examine the photo's individual pixels and compare the textures of different objects in the photo.

Analyzing Reflections

Reflections in a photo can help forensic experts identify signs of manipulation. For example, if a reflection in a mirror differs from the object being reflected, this could be a sign that the photo has been edited. Forensic experts can use various tools and techniques to analyze the reflections in a photo, such as measuring the angles of reflections and comparing the brightness of different areas of the photo.

Specialized Video Forensic Software

There are specialized video forensic software programs that can be used to analyze photos for signs of forgery. These programs look for inconsistencies in lighting, shadows, textures, reflections, and other indicators of tampering. For example, some programs can detect cloning, airbrushing, and liquify-style warping.

While these methods are still relevant and useful for uncovering fakes created by generative AI, the technical challenges and expertise required to spot fakes have increased substantially. Even the most experienced video forensic examiners are challenged by fake photos created using generative AI. As generative AI technology develops, distinguishing between real and fake photos will become even more challenging. 

Standout Signs of a Faked Photo

As of this writing, generative AI has challenges in creating photos that can fool an experienced forensic examiner. There are various signs of a faked photo that an examiner would review, including:

Inconsistencies in Lighting and Shadows 

Generative AI models sometimes have difficulty creating realistic lighting and shadows. For example, a fake photo may have shadows that go in the wrong direction or that are too dark or too light.

Inconsistencies in Textures 

Generative AI models can also have difficulty creating realistic textures. For example, a fake photo may have skin that looks too smooth or plastic or hair that looks too perfect.

Inconsistencies in Reflections 

Generative AI models can have difficulty creating realistic reflections. For example, a fake photo may have a reflection in a mirror that is different from the object being reflected.

Examination Using Specialized Video and Photo Forensics Software

Fortunately, specialized video and photo forensics software, in the hands of a qualified photo and video forensic expert, is powerful and growing in capability as the need to authenticate photo evidence rises daily. An examiner armed with these tools can perform the following examinations:

File Analysis

By searching databases of known images, the original unaltered image can sometimes be located, revealing, for example, that it originated from a social media platform before being used to create a fake. In some instances, this analysis can also identify the originating device, such as the camera used to take the photo.

Compression and Reconstruction 

With forensic software, an examiner can determine whether a photo contains multiple compression ratios in the same image and whether the original compression differs from that of the photo under review. Either finding would indicate potential tampering, for example, if more than one photo was collaged to create the fake. This analysis can also uncover artifacts related to resizing, color processing, rotation, or other modifications to an image.
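As a greatly simplified illustration of the idea (real tools estimate compression quality from JPEG quantization artifacts, which is considerably more involved), an image whose regions show very different estimated compression quality may contain pasted-in content. The function and threshold below are hypothetical:

```python
# Toy sketch only: assume a forensic tool has already estimated a JPEG
# quality value (0-100) for each region of the image. If the estimates
# spread too widely, parts of the image may have been collaged in from
# another photo with different compression.
def compression_consistent(region_qualities, max_spread=10):
    """Flag an image whose regions differ too much in estimated quality."""
    return max(region_qualities) - min(region_qualities) <= max_spread

print(compression_consistent([82, 85, 84, 83]))  # True: uniform compression
print(compression_consistent([82, 85, 60, 83]))  # False: one region stands out
```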

Camera Identification

If the fake is made from a photo taken with a digital camera, it is possible to link it to that camera through the visual artifacts the camera creates. These artifacts are often undetectable to the human eye, but they can tie the tampered photo back to the device that took the picture. For example, if someone claimed they did not take a photo, but comparing the tampered photo with exemplar photos positively identifies their camera as the device that took it, their assertion would be proven false.
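The underlying idea can be sketched with synthetic data. Real camera identification relies on sensor noise patterns (often called PRNU) extracted from images with specialized tools; this toy example shows only the matching step, correlating a photo's noise with a camera's reference pattern, using made-up noise signals:

```python
import math
import random

# Illustrative sketch: correlate a photo's noise residual with a reference
# noise "fingerprint" built from exemplar photos known to come from the
# suspect camera. High correlation suggests the same device.
def correlation(a, b):
    """Normalized cross-correlation of two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den

random.seed(1)
fingerprint = [random.gauss(0, 1) for _ in range(500)]        # camera's pattern
same_camera = [f + random.gauss(0, 0.5) for f in fingerprint]  # shares pattern
other_camera = [random.gauss(0, 1.1) for _ in range(500)]      # unrelated

# A photo from the same camera correlates strongly with the fingerprint;
# one from a different camera does not.
print(correlation(fingerprint, same_camera) > 0.5)   # True
print(correlation(fingerprint, other_camera) > 0.5)  # False
```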

Geometric Analysis

One of the most challenging parts of forging an image is keeping the lighting, shadows, and perspective consistent with what a camera would capture in reality. Forensic software can analyze the visual scene to determine whether the shadows are cast realistically, whether the highlighting on an object or person makes sense given the location of a light source, and whether the angle from which the photo was taken is consistent with a realistic perspective.
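One such check can be sketched with hypothetical coordinates: under a single distant light source such as the sun, every shadow in a scene should point in roughly the same direction, so widely divergent shadow angles suggest a composite. The function names and tolerance below are invented for illustration:

```python
import math

# Toy sketch, not a real forensic tool: measure each shadow's direction
# from the object's base to the shadow's tip, then flag the scene if any
# two shadows diverge by more than a tolerance.
def shadow_angle(base, tip):
    """Angle in degrees of the vector from object base to shadow tip."""
    return math.degrees(math.atan2(tip[1] - base[1], tip[0] - base[0]))

def shadows_consistent(shadows, tolerance_deg=15.0):
    """shadows: list of (base, tip) coordinate pairs measured in the image."""
    angles = [shadow_angle(b, t) for b, t in shadows]
    for i in range(len(angles)):
        for j in range(i + 1, len(angles)):
            diff = abs(angles[i] - angles[j]) % 360
            diff = min(diff, 360 - diff)  # wrap to the smaller arc
            if diff > tolerance_deg:
                return False
    return True

# Two shadows pointing the same way pass; adding a third that points the
# opposite direction (as in a composited photo) fails the check.
same = [((0, 0), (5, 1)), ((10, 0), (15, 1))]
mixed = same + [((20, 0), (15, -1))]
print(shadows_consistent(same))   # True
print(shadows_consistent(mixed))  # False
```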

Suggestions for Attorneys and Claims Professionals

While their jobs are challenging enough, attorneys and claims adjusters unfortunately need to be more vigilant than ever before. A faked photo could be a screenshot of text messages containing a damning conversation, or an image of an alleged injury or assault. Complicating this issue, the tools used to create generative AI photos are available to everyone and require little technical sophistication to employ.

In general, it is wise to maintain a posture of incredulity concerning photos submitted as evidence. Here are some suggestions for attorneys and claims professionals: 

  • Be skeptical of photos submitted as evidence, especially if the original device the photos were allegedly taken on is gone and cannot be used as a source of verification.  
  • Request the device that took the photos, not just the photos themselves. If you find a photo does warrant examination by a photo and video forensic expert, having the device the photo was allegedly taken with aids in the examination process. 
  • If the image feels off, even when you cannot spot anything in particular, it may be worthwhile to have it examined.  

While generative AI creates new challenges, the sophistication of forensic methods and tools to examine photos has also increased. As a community, the legal and insurance world has dealt with forged documents, manually faked photos, and other forms of misinformation. Awareness is the first step in preventing or remediating the impact of faked photos, and that starts with recognizing that one showing up in your case or claim is a real and distinct possibility. As they say, a picture is worth a thousand words. At least it used to be.


Perrier & Lacoste Hires Two New Attorneys

September 2023 • Source: Perrier & Lacoste

Perrier & Lacoste is pleased to announce that Kalleigh A. McCoy and Brian A. Gilbert have joined their firm as P&L's newest attorneys. Kalleigh has experience in mass tort litigation, risk assessment, contracts, and federal reporting. Brian has experience in construction litigation, commercial litigation, insurance disputes, personal liability defense, and commercial casualty defense. P&L is thrilled to have them as part of their team and asks that you join them in welcoming Brian and Kalleigh to the firm.


Lederer Weston Craig Trial Results

December 2023 • Source: Lederer Weston Craig

Lederer Weston Craig attorneys Kent Gummert and Alexandra Galbraith Davis tried a legal malpractice claim in Polk County, Iowa in November 2023. LWC defended its client after the client missed the statute of limitations to pursue an underinsured motorist claim. The client admitted liability; the only issue at trial was damages. The Plaintiffs asked for $275,000 at trial, and the jury rendered a verdict of $29,983.60. After offsets from underlying settlements, the likely judgment will be $2,000.


Gary Lovell Inducted Into the Litigation Counsel of America

November 2023 • Source: Copeland, Stair, Valz & Lovell, LLP

Copeland, Stair, Valz & Lovell, LLP proudly announces that Gary Lovell has been inducted into membership as a fellow in the Litigation Counsel of America. The Litigation Counsel of America is an invitation-only lawyer honorary society composed of less than one-half of one percent of American lawyers. Fellows are selected based upon effectiveness and accomplishment in litigation, both at the trial and appellate level, and superior ethical reputation. Gary’s 36+ years of practice and 100+ jury trials of catastrophic cases show his dedication to the art and science of litigation, as reflected in his invitation to membership in the Litigation Counsel of America and the American Board of Trial Advocates.


Dan Folluo of Rhodes, Hieronymus, Jones, Tucker & Gable was inducted as a Fellow of the International Academy of Trial Lawyers (IATL) during its 2023 annual meeting

October 2023 • Source: Rhodes, Hieronymus, Jones, Tucker & Gable

The International Academy of Trial Lawyers limits membership to 500 Fellows from the United States in addition to Fellows from nearly 40 countries across the globe. IATL seeks out, identifies, acknowledges, and honors those who have achieved a career of excellence through demonstrated skill and ability in jury trials, trials before the court, and appellate practice. Members are engaged in civil practice on both the plaintiff’s and the defendant’s side of the courtroom, and the trial of criminal cases. The Academy invites only lawyers who have attained the highest level of advocacy. A comprehensive screening process identifies the most distinguished members of the trial bar by means of both peer and judicial review.

Chartered in 1954, the Academy’s general purposes are to cultivate the science of jurisprudence, promote reforms in the law, facilitate the administration of justice, and elevate the standards of integrity, honor, and courtesy in the legal profession.
