Tips for finding deepfakes in evidence

Deepfakes use "deep learning," a complex machine learning technique, to create fake images, videos, and audio files. If you haven't seen the uncanny deepfake video that morphs Bill Hader's face as he impersonates Tom Cruise and Seth Rogen, check it out.

Many people create deepfakes just for fun, but manipulated video and audio have found their way into lawsuits. So how do we keep forgeries from being admitted into evidence?

Below are some things to look for when reviewing audio and video evidence that seems too good (or too bad) to be true.

Inconsistent lighting

Pay special attention to lighting and shadows in video. Is the individual's shadow where you would expect it to be based on the light source? Does the shadow or light source move in a way that doesn't make sense?

Unusual eye/body movements

Computer programs have a hard time mimicking natural blinking and eye movements, so you may notice an individual in a deepfake video seemingly staring without blinking, or their eyes not following the person they are talking to.

When an individual turns her head or body, watch for distortion or choppy video quality. If one individual's head has been placed on another's body, you may also notice an awkward posture or body shape.
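Automated tools build on the same intuition. As an illustrative sketch only (not a court-tested method), suppose a facial-landmark library such as dlib or MediaPipe has already produced an eye-aspect-ratio (EAR) value for each frame; the threshold and blink-rate figures below are assumptions chosen for illustration:

```python
# Hypothetical sketch: flag suspiciously low blink rates in a video.
# Assumes per-frame eye-aspect-ratio (EAR) values have already been
# extracted by a facial-landmark tool; all numbers are illustrative.

def count_blinks(ear_values, threshold=0.2, min_frames=2):
    """Count blinks: runs of at least min_frames consecutive frames
    where the eye aspect ratio drops below the closed-eye threshold."""
    blinks = 0
    run = 0
    for ear in ear_values:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # blink still in progress at end of clip
        blinks += 1
    return blinks

def blink_rate_suspicious(ear_values, fps, min_blinks_per_minute=4):
    """Humans blink roughly 15-20 times a minute; far fewer can be a
    deepfake tell. Treat this as a hint, not proof -- short clips and
    staged footage also produce low counts."""
    minutes = len(ear_values) / fps / 60
    if minutes == 0:
        return False
    return count_blinks(ear_values) / minutes < min_blinks_per_minute
```

A reviewer would run this over each face track in the clip and flag tracks whose blink rate falls well below the human baseline for closer expert inspection.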

Unnatural facial features

Keep a close eye on the nose. With bad deepfakes, you may easily be able to tell that the individual's mouth doesn't match the words she is saying. A subtler sign is when an individual's nose points in a slightly different direction than the rest of their face.

Mismatched metadata

Here's where the experts come in. Beyond the discrepancies you may see or hear, background data attached to a digital file can show whether it has been tampered with.

For example, when you load an audio file into an editing program like Audacity and export it, the metadata of the exported recording looks different from that of the raw file recorded on your phone. These differences can even indicate what software program was used. Lawyers in a 2019 custody case in the UK were able to prove that an audio recording was doctored by examining its metadata.
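To make the idea concrete, here is a hedged sketch of one such check. Many audio editors write a LIST/INFO chunk containing an "ISFT" (software) tag into exported WAV files, while a raw phone recording typically lacks one. This is a simplified parser for illustration, not a forensic tool, and any software string it finds is only a starting point for an expert:

```python
# Illustrative sketch: look for an editing-software fingerprint in the
# metadata of a WAV file, given its raw bytes. Simplified parser only.
import struct

def wav_software_tag(data: bytes):
    """Return the ISFT (editing software) string from raw WAV bytes,
    or None if no such tag is present."""
    if data[:4] != b"RIFF" or data[8:12] != b"WAVE":
        return None  # not a RIFF/WAVE file
    pos = 12
    while pos + 8 <= len(data):
        chunk_id = data[pos:pos + 4]
        size = struct.unpack("<I", data[pos + 4:pos + 8])[0]
        body = data[pos + 8:pos + 8 + size]
        if chunk_id == b"LIST" and body[:4] == b"INFO":
            sub = 4
            while sub + 8 <= len(body):
                sub_id = body[sub:sub + 4]
                sub_size = struct.unpack("<I", body[sub + 4:sub + 8])[0]
                if sub_id == b"ISFT":
                    raw = body[sub + 8:sub + 8 + sub_size]
                    return raw.rstrip(b"\x00").decode("ascii", "replace")
                sub += 8 + sub_size + (sub_size % 2)  # sub-chunks are word-aligned
        pos += 8 + size + (size % 2)  # top-level chunks are word-aligned too
    return None
```

A file that claims to be a pristine phone recording but carries an editor's software tag deserves a much closer look.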

Digital forensics experts can examine the data behind those audio and video files to help you determine what's real.
