Need to Spot a Real-Time Deepfake? Tell the Person to Turn Sideways

PCMag
Published in PC Magazine · 3 min read · Aug 10, 2022
(Credit: Metaphysic)

Real-time deepfakes also run into trouble when the user puts a hand over their face.

By Michael Kan

The potential for deepfakes to scam users during video calls is rising. But one company is pointing out that the AI-powered technology has an easy-to-spot flaw: It struggles to render fake faces at sideways angles.

The findings come from Metaphysic, an AI content-generating company that recently examined some of the limitations of real-time deepfakes. Through a series of tests, the company’s report shows the deepfake technology can faithfully render a celebrity’s face over someone else’s face during a video call—but only when they’re facing forward.

(Credit: Metaphysic)

The fakery immediately collapses once the user turns their face at a 90-degree angle. The technology will also run into trouble if the user places a hand over their face.

Metaphysic released the report weeks after the FBI warned that fraudsters have been exploiting deepfake technology to impersonate job candidates during interviews. The scammers have been doing so while applying for remote jobs that could’ve given them access to financial and confidential corporate data.

Metaphysic offers an easy way for job interviewers to spot a real-time deepfake during a video call. The company ran the demo by using DeepFaceLab, the free software behind many popular deepfake videos circulating on YouTube. The software also has a real-time version called DeepFaceLive, which can swap a celebrity’s face for your own.

Although the technology can pull off the deepfakery with impressive results, the software wasn’t designed to run real-time face-swapping at sharp side angles. For example, the facial-mapping process will accidentally generate an additional eye or eyebrow on faces that appear sideways.

That said, it’s still possible to train deepfake models that faithfully render faces at 90-degree angles, as Metaphysic’s demonstration video shows. But even then, the deepfake works only when you have enough facial data on the subject you’re trying to impersonate. Metaphysic notes that, outside of celebrities, most people don’t have sideways profile photos of themselves circulating on the internet.

“Unless you’ve been arrested at some point, it’s likely that you don’t have even one such image, either on social media or in an offline collection,” the company added.

Nevertheless, it may only be a matter of time before deepfake technologies overcome these flaws by simulating missing facial data. For example, Nvidia’s Neural Radiance Fields (NeRF) technology can turn a collection of 2D still photos into a 3D scene. As a result, the security industry may face a cat-and-mouse game in trying to detect real-time deepfakes in the future.

Originally published at https://www.pcmag.com.
