AI-Powered Scams: The New Threat

NEFTURE SECURITY | Blockchain Security
Published in Dissecting Web3 · Nov 27, 2023

AI tools have driven an alarming escalation in social engineering threats!

Social engineering is a form of manipulation or deception used by individuals or groups to exploit human psychology and behavior in order to gain unauthorized access to information, systems, or resources.

It can take various forms.

Probably the best-known of these are romance scams.

But social engineering was also behind CoinsPaid’s $37.3 million hack, carried out by the North Korean state-sponsored hacking group Lazarus, and it nearly cost a single person $125 million in a “business scam” last year.

Socially engineered scams are among the most damaging scams in existence, for both individuals and companies.

AI has opened up new, alarming opportunities for social engineering to cause even greater damage, and at an even higher rate.

Here’s how.

Raising the Scale of Social Engineering

The majority of social engineering is time-consuming and labor-intensive for scammers.

AI allows fraudsters to delegate part, or almost all, of the work of entrapping people in their web of lies.

Their tools? Autonomous agents, sophisticated scripts built on their own experience, or on the experience of other scammers when they are part of a criminal organization, as well as other classic automation tools.

For instance, they can automate every step of the process, from selecting their targets to delivering phishing emails, and even providing human-like responses in chat boxes or phone calls.

What’s more, beyond the scammers’ own experience, AI can continuously improve its tactics through ongoing learning.

Selecting what proves effective and discarding what doesn’t, it evolves ever more clever phishing strategies to find the most effective path to scamming unsuspecting people.

AI offers fraudsters the capability to successfully conduct highly targeted social engineering attacks on a massive scale.

Sophisticating The Unsophisticated

AI is a formidable tool in aiding scammers to become much more efficient and convincing in their attacks, thereby increasing the success rate of even the most unsophisticated scams.

Most people, okay, everyone, has received a very basic phishing email in their lives. The seedy nature of such emails can be easily spotted through spelling or grammar mistakes, very poor wording, or overused scamming tropes, like the classic ‘Nigerian Prince’ wanting to share his diamonds with you!

AI tools not only help non-native-speaker fraudsters draft well-written emails but can even come up with new scenarios and sophisticated phishing campaigns.

On the darknet, AI tools dedicated to phishing scams can even be found, like WormGPT, which helps scammers pull off nefarious email campaigns.

Source: SlashNext

AI tools increase the success rate of scams and make them even more sophisticated, adding extra steps at no cost to scammers, who can automate days of flawless, coherent, and relevant conversation.

Richard Ma, co-founder of the Web3 security firm Quantstamp, shared an AI-powered attack anecdote with Cointelegraph: a scammer pretended to be the chief technology officer (CTO) of their client’s firm and messaged engineers at the company for days, keeping up basic exchanges and engaging them in a multitude of conversations, laying the groundwork for trust so that they would be ripe for the picking once the scammer sprang the trap.

Sophisticating The Sophisticated

Probably the most impressive part of AI is its ability to create fake humans.

Artificial intelligence can craft convincing deepfakes, synthetic videos and counterfeit virtual personas that bear a striking resemblance to reality.

Deceivers can impersonate a real individual, like a high-ranking executive, and strike up conversations with their targets, employing social engineering techniques to coax them into divulging confidential data or executing financial transactions.

Moreover, AI is adept at replicating human speech and audio, enabling sophisticated voice phishing attacks, often referred to as “vishing.”

The deployment of AI-driven voice cloning technology has repeatedly demonstrated its proficiency in assuming the identities of family members, successfully coercing unsuspecting victims into transferring funds under the guise of a family crisis.

As was the case in this AI kidnapping scam, which copied a teen girl’s voice in a $1M extortion attempt.

This also represents an incredible opportunity in multi-billion-dollar industries like pig-butchering scams, where people are usually hired, or enslaved, to produce the audio and video that make victims believe the person they are engaged in a romantic or platonic relationship with is real!

Or to lure people to a shady company, as was the case with Maxpread Technologies, which created a fake CEO using an avatar generated on Synthesia.io and dubbed “Gary.”

Source: Forbes

This salt-and-pepper fake CEO was used to promote an MLM promising “incredible profits for investors” while trying to pass the company off as a cutting-edge firm “using AI to trade crypto assets.” Maxpread was subsequently served with a cease-and-desist order by a California financial regulator for the sale of unauthorized securities.

Additionally, mainstream AI tools can also be successfully weaponized using “Indirect Prompt Injection.”

Cybersecurity researchers, in their own words, achieved the feat of “Transforming Bing Chat into a Data Pirate.” They accomplished this by manipulating the Bing chatbot into masquerading as a Microsoft employee and then generating phishing messages aimed at soliciting users’ credit card information.

A very easy trick, made possible by users allowing Bing Chat to see their currently open websites.

The researchers reported that a potential victim doesn’t even need to start the interaction with Bing Chat for a scammer to be able to execute the phishing attack.

Additionally, an attacker can “plant an injection in a website the user is visiting, which silently turns Bing Chat into a Social Engineer who seeks out and exfiltrates personal information.”
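To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of why indirect prompt injection works. It assumes a browsing assistant that naively concatenates raw page content into the same prompt as the user’s request; the function name, prompts, and page text are hypothetical and do not reflect Bing Chat’s actual implementation.

```python
# Conceptual sketch only: a hypothetical browsing assistant that mixes
# trusted user instructions and untrusted web content in a single prompt.
def build_prompt(user_request: str, page_text: str) -> str:
    # Both inputs share one channel, so the model has no reliable way to
    # tell the user's instructions apart from text planted in the page.
    return (
        "You are a helpful browsing assistant.\n"
        f"User request: {user_request}\n"
        f"Content of the currently open page:\n{page_text}"
    )

visible_text = "10 tips for growing tomatoes on a balcony."
# Text the user never sees (e.g. hidden markup or white-on-white text)
# is still ingested by the assistant as ordinary page content.
hidden_text = "[hidden text addressed to the assistant, posing as new instructions]"

prompt = build_prompt("Summarize this page.", visible_text + "\n" + hidden_text)
print(prompt)  # the planted "instructions" now sit inside the model's context
```

Any assistant that cannot separate these two channels will treat the planted text with the same authority as the page’s legitimate content, which is exactly what the researchers exploited.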

There’s no question that AI has opened a brand new era for fraudsters.

Scammers are now armed with AI tools that can help them execute highly sophisticated attacks on an industrial scale, making the Web2 and Web3 spaces even more challenging, hostile, and filled with traps for their users to navigate.

About us

Nefture is a Blockchain Security Company that secures crypto transactions!

With Nefture Security, within ✨seconds✨ you can find out if your wallet has been compromised and get your wallet security audit for free.

Check if your wallet is compromised now: https://www.app.nefture.com/
