Artificial Intelligence helps detect deepfake videos

Researchers tested the tool with an AI-based neural network on videos of former U.S. President Barack Obama. The neural network spotted over 90% of lip syncs involving Obama himself.

October 18, 2020 12:59 pm | Updated 01:07 pm IST

The team built a tool to look for ‘visemes’, or mouth formations, and ‘phonemes’, or phonetic sounds.

Researchers at Stanford University and UC Berkeley have devised a programme that uses artificial intelligence (AI) to detect deepfake videos.

The programme is said to spot 80% of fakes by recognising minute mismatches between the sounds people make and the shapes of their mouths, according to the study titled ‘Detecting Deep-Fake Videos from Phoneme-Viseme Mismatches’.

Deepfake videos can be made using face-swapping or lip sync technologies. Face swap videos can be convincing yet crude, leaving digital or visual artifacts that a computer can detect.

Lip syncing is subtle, and harder to spot. The technology manipulates a smaller part of the image, and then synthesises lip movements that closely match the way a person’s mouth would move if they had said particular words. With enough samples of a person’s image and voice, a deepfake video-maker can manipulate an image to “say” anything, the team said.

The team built a tool to look for ‘visemes’, or mouth formations, and ‘phonemes’, or phonetic sounds.
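The idea of flagging phoneme–viseme mismatches can be illustrated with a minimal sketch. The mapping, threshold, and function names below are hypothetical stand-ins for illustration only; the researchers' actual tool uses a trained neural network, not a lookup table.

```python
# Illustrative sketch of phoneme-viseme mismatch detection.
# The mapping and threshold here are hypothetical, chosen for clarity;
# the real study trains a neural network on aligned audio and video.

# Hypothetical phoneme -> expected viseme class. Bilabial sounds such as
# "m", "b" and "p" require fully closed lips, a visual cue that lip-sync
# deepfakes often fail to reproduce.
PHONEME_TO_VISEME = {
    "m": "lips_closed", "b": "lips_closed", "p": "lips_closed",
    "f": "lip_to_teeth", "v": "lip_to_teeth",
    "aa": "mouth_open", "ae": "mouth_open",
    "uw": "lips_rounded", "ow": "lips_rounded",
}

def mismatch_rate(phonemes, observed_visemes):
    """Fraction of checked frames where the audio and video disagree."""
    checked = mismatched = 0
    for phoneme, viseme in zip(phonemes, observed_visemes):
        expected = PHONEME_TO_VISEME.get(phoneme)
        if expected is None:
            continue  # phoneme has no strong visual signature; skip it
        checked += 1
        if viseme != expected:
            mismatched += 1
    return mismatched / checked if checked else 0.0

def looks_fake(phonemes, visemes, threshold=0.3):
    """Flag a clip when the mismatch rate exceeds a chosen threshold."""
    return mismatch_rate(phonemes, visemes) > threshold

# A genuine clip: mouth shapes track the audio, so the rate is 0.0.
real = mismatch_rate(["m", "aa", "p"],
                     ["lips_closed", "mouth_open", "lips_closed"])
# A suspect clip: the lips stay open during the bilabial sounds.
fake = mismatch_rate(["m", "aa", "p"],
                     ["mouth_open", "mouth_open", "mouth_open"])
```

In this toy example the genuine clip scores a mismatch rate of 0.0 while the suspect clip scores 2/3, well above the flagging threshold.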

The team tested the tool with an AI-based neural network on videos of former U.S. President Barack Obama. The neural network spotted over 90% of lip syncs involving Obama himself.

Although the program may help detect visual anomalies, deepfake detection is a cat-and-mouse game. As deepfake techniques improve, fewer clues will be left behind, the team said.

Deepfakes could also lead to a spike in misinformation that is much harder to spot.
