Deepfake Videos Could Be Exposed

By John Lister

Intel claims it can spot 96 percent of "deepfake" videos. The trick appears to be tracking blood flow in the face.

A deepfake is a more sophisticated version of the comparatively basic task of replacing one person's face with another in a video. Common uses include making the person appear to be saying something they never actually said (creating political distrust) or making it look like they performed lewd actions.

The "deep" element comes from deep learning, where a computer program tries out and develops different tactics for performing a task better. In this case, the task is making the video more convincing, particularly in syncing the mouth movements in the video to the audio.

Real-Time Analysis

The Metro notes deepfakes could even become international weapons. At one stage Ukraine feared Russia would create a deepfake that appeared to show Ukraine's President Volodymyr Zelensky announcing a full surrender. (Source: metro.co.uk)

Even some of the best quality deepfakes can eventually be exposed by human or computer analysis, but this often takes several hours. By then, the video may already have spread virally and achieved its creator's aims.

Intel says it wanted to find a way to assess potential deepfake videos in real time, making it possible to immediately detect fakes and either block or label them.

It says many attempts to create a detection system have concentrated on analyzing fakes to spot patterns of tell-tale signs. Instead Intel developed its system by analyzing real videos to find patterns that indicate genuine footage.

Blood Unlocks Secrets

It believes the key is the subtle changes in skin color across the face as the heart pumps blood through the veins, a signal known as photoplethysmography. In a fake video, the need to manipulate the facial image to fit the supposed speech disrupts the visual pattern of these blood flow signals.
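Intel has not published its algorithm, but the core idea of remote photoplethysmography can be sketched very roughly: average a color channel over a face region in each frame, then look for a dominant periodic component near a plausible heart rate. The sketch below uses synthetic frames and illustrative numbers throughout; it is a toy demonstration of the principle, not Intel's method.

```python
import numpy as np

def ppg_signal(frames, region):
    """Crude remote-PPG: mean green-channel intensity inside a face
    region, per frame. Real detectors use far more robust extraction."""
    y0, y1, x0, x1 = region
    return np.array([f[y0:y1, x0:x1, 1].mean() for f in frames])

def peak_power(sig):
    """Ratio of the strongest spectral peak to the mean spectrum.
    A genuine pulse produces a sharp peak; noise stays flat."""
    spec = np.abs(np.fft.rfft(sig - sig.mean()))
    return spec.max() / spec.mean()

rng = np.random.default_rng(0)
t = np.arange(90)                                  # 3 s at 30 fps
pulse = 2.0 * np.sin(2 * np.pi * 1.2 * t / 30)     # ~72 bpm pulse

def make_frames(signal):
    # 90 frames of a flat 64x64 "face"; the pulse (if any) modulates
    # the green channel, plus per-pixel sensor noise.
    base = np.full((90, 64, 64, 3), 128.0)
    base[:, :, :, 1] += signal[:, None, None]
    return base + rng.normal(0.0, 1.0, base.shape)

real = ppg_signal(make_frames(pulse), (16, 48, 16, 48))
fake = ppg_signal(make_frames(np.zeros(90)), (16, 48, 16, 48))

print(peak_power(real) > peak_power(fake))  # expect True
```

Averaging over the whole region is what makes the faint pulse recoverable: per-pixel noise cancels, while the blood-flow signal is coherent across the skin, which is exactly the coherence a face-swapping pipeline tends to destroy.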

Intel claims it can now detect deepfake videos in milliseconds with a "96 percent accuracy rate." The company suggests social media companies could use this to vet uploads, while media outlets could use it to check videos before using them in reporting. It also hints at letting nonprofit organizations make the technology more widely available. (Source: intel.com)

The downside of course is that the Intel announcement tips off the fakers as to how they can make the videos more convincing, but it may at least be a short-term victory in the game of cat and mouse.

What's Your Opinion?

Do you think you could spot a deepfake video? Should it be a specific crime to create a deepfake video of an individual? Is it worth investing in technology to spot fakes?
