FBI: Scammers Are Interviewing for Remote Jobs Using Deepfake Tech

PCMag · Published in PC Magazine · 2 min read · Jul 1, 2022

(Image: Facebook/Meta)

The scammers may be trying to score jobs at IT companies in order to access customer or financial data, corporate IT databases and/or proprietary information, the FBI says.

By Michael Kan

Scammers have been exploiting deepfake technology to impersonate job candidates during interviews for remote positions, according to the FBI.

The agency has recently seen an increase in complaints about the scam, it said in a public advisory on June 30. Fraudsters have been using both deepfakes and personal identifying information stolen from victims to dupe employers into hiring them for remote jobs.

Deepfakes involve using AI-powered programs to create realistic but phony media of a person. In the video realm, the technology can be used to swap a celebrity’s face onto someone else’s body. On the audio front, the programs can clone a person’s voice, which can then be made to say whatever the creator wants.

The technology is already being used in YouTube videos to entertaining effect. However, the FBI’s advisory shows deepfakes are also fueling identity theft schemes. “Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the FBI says.

The scammers have been using the technology to apply for remote or work-from-home jobs at IT companies. The FBI didn’t clearly state the scammers’ end goal, but the agency noted, “some reported positions include access to customer PII (personal identifying information), financial data, corporate IT databases and/or proprietary information.”

Such info could help scammers steal valuable details from companies and carry out other identity fraud schemes. But in some good news, the FBI says there’s a way employers can detect the deepfakery. To secure the jobs, scammers have been participating in video interviews with prospective employers, and the AI-based technology can still show flaws when the scammer is speaking on camera.

“The actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking,” the agency said. “At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”

Originally published at https://www.pcmag.com.
