June 29, 2022, updated 30 Jun 2022 6:18am

Criminals are using deepfakes to apply for remote IT jobs, FBI warns

Scammers are using deepfakes in job interviews to trick employers into giving them access to IT systems.

By Claudia Glover

Cybercriminals have been using deepfakes and personally identifiable information (PII) found on the dark web to apply for remote IT jobs in order to gain access to systems, passwords and sensitive information, the FBI has warned.

Stolen images found on the dark web can be used to map faces and create deepfake videos. (Image by Edward Webb / CC-BY-SA 3.0 Unported)

An alert from the FBI issued yesterday details a rise in complaints filed with the bureau reporting the use of deepfakes and stolen PII to apply for a variety of remote positions.

Deepfakes are false computer-generated audio or visual representations of a real person, which are increasingly being used in scams.

The roles most often targeted in this new scam are IT and computer programming positions, because a successful fraud gives criminals “access to customer PII, financial data, corporate IT databases and/or proprietary information,” the alert said.

How to spot deepfakes

Criminals are using stolen images and credentials found on the dark web to build their deepfakes. They can then use the fraudulent video or audio to impersonate a real candidate in a virtual interview. The FBI says there are some tell-tale signs in the bogus interviews that indicate the interviewee may not be real.

“The actions and lip movement seen when interviewed on camera do not completely coordinate with the audio of the person speaking,” the advisory notes. “At times, actions such as coughing, sneezing or other auditory actions are not aligned with what is presented visually,” it continues.
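The mismatch the FBI describes, lip movement that does not track the audio, can in principle be measured. As a purely illustrative sketch (not a real detector, and not anything the FBI advisory specifies), the idea is to correlate the loudness envelope of the audio with a per-frame mouth-openness signal; extracting mouth openness from video is assumed to happen elsewhere, so synthetic data stands in for it here:

```python
import numpy as np

def av_sync_score(audio_envelope, mouth_openness):
    """Pearson-style correlation between the audio loudness envelope and a
    per-frame mouth-openness signal. Values near 1 suggest the lips track
    the speech; values near 0 suggest the motion is unrelated to the audio."""
    a = (audio_envelope - audio_envelope.mean()) / audio_envelope.std()
    m = (mouth_openness - mouth_openness.mean()) / mouth_openness.std()
    return float(np.mean(a * m))

# Synthetic demo: a "real" speaker whose mouth tracks the audio,
# versus a "deepfake" whose mouth moves independently of it.
rng = np.random.default_rng(0)
audio = np.abs(np.sin(np.linspace(0, 20, 200))) + 0.1 * rng.random(200)
real_mouth = audio + 0.05 * rng.random(200)  # closely tracks the audio
fake_mouth = rng.random(200)                 # motion unrelated to the audio

print(av_sync_score(audio, real_mouth))  # high, close to 1
print(av_sync_score(audio, fake_mouth))  # low, close to 0
```

Real detection systems are far more sophisticated, but the principle is the same: genuine footage shows strong audio-visual correlation, while many deepfakes do not.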

Pre-employment background checks have also flagged the fact that the information being used to apply for the job actually belongs to another individual. It is vital, therefore, to report stolen PII as soon as it happens as it is easy to replicate an identity with leaked information. According to the UK Information Commissioner’s Office (ICO), “your name, address and date of birth provide enough information to create another ‘you’”.

Deepfake scams are on the rise

The use of deepfakes in cyber scams is on the rise. Last month a scam was spread on YouTube that used Tesla CEO Elon Musk’s face to swindle people out of cryptocurrencies Bitcoin and Ethereum. The online criminals were hijacking accounts and channels, altering the layout to imitate official Tesla channels and posting deepfake videos of Musk inviting viewers to participate in bogus cryptocurrency giveaways. According to the BBC the scammers made $243,000 in just over a week. YouTube has been criticised for not removing fakes quickly enough.


Some apps require only one picture to create a passable deepfake of a person. Similar to novelty apps such as Reface or Wombo, they need a minimum of input data to map one face onto another. The use of this sort of app is being monitored, but the technology is in its infancy.

Speaking to Tech Monitor last year, deepfake expert Henry Ajder said: “Deepfake detection has a role, but it’s an incredibly challenging technology to get right. I think we do need to have a little bit more nuance when thinking about other approaches that we could be engaging in to tackle the problem.”

Read more: Listen carefully - the growing threat of audio deepfake scams
