The Deepfake Scam That Is Blowing Everyone’s Mind

During a video meeting, an employee of a multinational company in Hong Kong was instructed by her boss to transfer HK$200 million (about US$25 million) to various bank accounts in Hong Kong. It turned out that the person she saw in the meeting was not her boss, but a deepfake of him.

Sumit Kumar
SCAM UNIVERSITY
4 min read · Feb 11, 2024


A video call with a deepfake boss resulted in a $25 million scam (source: AI image by author)

Imagine being on a video call with your boss and following his instructions, only to discover later that the person on the call was not your boss: it was your boss’s face digitally mapped onto someone else’s body, and even the voice you heard was an AI-generated replica of his voice.

It may sound far-fetched, but it’s true. Scammers are now using deepfake technology to carry out convincing impersonations in pursuit of their malicious goals.

In the following account, we’ll explore a troubling incident that targeted an employee of a multinational corporation in Hong Kong. This story highlights the importance of remaining vigilant against the emerging misuse of deepfake technology.

The Deceptive Meeting

An employee of a multinational corporation based in Hong Kong received an email purportedly from the company’s chief financial officer, who was based in the UK. The email instructed her to transfer up to HK$200 million (about US$25 million) to various bank accounts in Hong Kong. When she did not fall for the email, she was invited to join a video meeting.

During the meeting, the employee saw what appeared to be the company’s chief financial officer, accompanied by other familiar faces. The meeting room also looked authentic, which further reinforced her confidence in the legitimacy of the situation. Accordingly, she complied with the instructions given during the meeting and transferred HK$200 million (about US$25 million) to various bank accounts in Hong Kong.

However, after completing the transfers, the employee began to feel uneasy about the circumstances surrounding them.

Suspecting foul play, the employee contacted the company’s head office, only to discover that the entire meeting had been a meticulously planned scam.

The Anatomy of the Deepfake Scam

Everything seemed legitimate. The employee saw familiar faces, heard familiar voices, and even recognized the meeting room. Yet, unbeknownst to her, the person she saw in the meeting was not the chief financial officer but rather a meticulously crafted deepfake — a digital clone created with the aid of artificial intelligence.

Behind the scenes, the perpetrators leveraged publicly available data — images, video clips, voice samples, and even details of the company’s office — to construct a convincing deception. With careful manipulation, they orchestrated a video call that seemed authentic in every aspect, leaving the unsuspecting employee with no reason to doubt its legitimacy.

The Threat of Deepfake Technology

Deepfakes blur the line between truth and fiction, making it increasingly difficult to tell fact from fabrication. From impersonating celebrities to defrauding individuals and organizations, and even blackmail, the potential for misuse is vast.

Recently, Twitter was rocked by a sexually provocative deepfake featuring a well-known Indian actress. Many people believed it was a leaked clip from one of her upcoming movies. Only when the authentic video emerged did it become clear that the clip circulating on Twitter was a deepfake.

With people revealing more and more about themselves online, the information required to create a convincing deepfake is not hard to procure.

Some apps can create an ultra-realistic nude version of an original image within seconds. The use of such images to sextort and bully victims is on the rise.

The emergence of deepfake technology poses a serious threat to individuals and society. Easily accessible creation tools can now produce ultra-realistic fake videos, images, and audio recordings, making it alarmingly simple to deceive and manipulate people.

How To Protect Yourself

If you receive a video message of a loved one in distress asking you to transfer money to an account or give it to someone who will collect it from you, it’s essential to exercise caution.

Before taking any action, try to contact the person through a channel you already know and trust. If immediate contact is not possible, be patient: most scams succeed because people panic and act in a rush.

Creating an emergency codeword with close family and friends is recommended. This way, if you receive a suspicious video message, text message, or call, you can use the codeword to confirm whether the request is genuine.

Most importantly, be skeptical of what you see and hear.

Conclusion

As technology continues to evolve, so do the tactics employed by cyber criminals. The rise of deepfake scams is a stark reminder of the importance of vigilance and skepticism in an increasingly digital world. By staying informed and implementing security measures, we can better protect ourselves and our organizations from the perils of deception.
