You could be next on a fraudster's list. What do you need to know?

Savneet Singh
5 min read · Jan 21, 2024


One thing we can all agree on is that AI is going to change the world, in both positive and unexpected ways. AI has the potential to advance medical research, help tackle climate change, alleviate food scarcity, improve business processes across industries, and much more. But we should also accept that, as with any new technology, there may be impacts on society and the environment that we never imagined or intended.

AI can also be a tool for people with malicious intent. Fraudsters may use AI to carry out more sophisticated forms of fraud and scamming. The Federal Trade Commission (FTC) reports(1) that consumers in the United States lost $8.8 billion to fraud in 2022.

Types of artificial intelligence (AI)-related scams

GenAI-related scams (generative AI used to create text, images, and deepfake videos): Scammers have been using generative AI tools such as GPT-4, Bard, DALL-E, AutoGen, and others to create customized emails, instant messages, images, posters, and other material as bait to hook victims.

Scammers use GenAI to bait people in various ways:

  1. Deepfake images and AI-generated selfies: Scams that use AI-generated images and selfies to pass KYC verification, unlock banking apps, bypass security apps, and obtain medical records have been growing rapidly.
  2. Mis-addressed text or email scams: A common and widespread scam involves sending deliberately mis-addressed text messages designed to lure potential victims and engage those who text back. Sean Gallagher has written a detailed account(2) of “well-established and well-organized scam rings” that gathered over $3 million in cryptocurrency over a five-month period.
  3. Fabricated images: Creating fabricated images of damaged vehicles, property, or individuals to support false insurance and medical claims is another common scheme.
  4. Synthetic content for business and webstore-related fraud: Scammers also create fake pages(3), social media accounts, websites, images, and other content to build a legitimate-looking business or campaign, which makes it easy to collect money through online payments. Some of these operations look so sophisticated that it is hard to tell the real from the fake.
  5. Scams on dating sites: Scammers use fake selfies, images, and deepfake videos on dating, professional networking, and e-commerce websites. Deepfake pornography is a growing concern, with underage girls(4) among the victims of AI-generated apps, videos, and images.

AI chatbots: Chatbot-driven AI scams resemble other types of online fraud, such as investment, romance, and identity-theft schemes.

  1. Most of the time, the scam starts with the scammer contacting the target while pretending to be someone else: liking a profile on a dating app, sending a message on social media, or posing as a recruiter from an investment firm with an irresistible opportunity.
  2. These AI chatbots can remember things and answer new questions fluently while “talking” to thousands of people at once.
  3. These chatbots can enable scams nobody has thought of before: they can adapt their deception, mimic different writing styles, and respond to messages in context.
  4. It is also easy for these bots to switch between languages to reach different targets.

Voice cloning-related scams: As AI has advanced, it has become easier for thieves to clone voices(5) and generate speech that sounds like it came from real people.

Leading news outlets(6) around the world(7) have reported several such fraud cases. These scams are both emotionally distressing and financially damaging.

Voice scams: how to detect them and what to do if you are a victim

What do we do now?

The Federal Trade Commission is issuing warnings about these scams and frauds. It is important to act now and create countermeasures that keep up with rapidly evolving AI technologies. If we do not keep pace, hard-fought improvements in fraud defense could soon be undone.

On the other hand, organizations are using machine learning to detect high-risk behavior, digital fraud, and fraudulent payments (a minimal sketch of one such approach appears below). Big technology companies are partnering with other industries to deal with synthetic content such as fake images, videos, and voice clones so that trustworthy content can be distinguished from untrustworthy content.
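To make the idea concrete, here is a minimal, purely illustrative sketch of anomaly-based payment screening. The transaction features (amount, hour of day, merchant risk score) and thresholds are invented for the example, and it uses scikit-learn's IsolationForest; real fraud systems rely on far richer signals and models.

```python
# Hypothetical sketch: flagging anomalous payment transactions with an
# unsupervised model. Feature names and values are illustrative only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated transaction features: [amount, hour_of_day, merchant_risk_score]
normal = np.column_stack([
    rng.lognormal(mean=3.5, sigma=0.6, size=1000),  # typical purchase amounts
    rng.integers(8, 22, size=1000),                  # daytime activity
    rng.uniform(0.0, 0.3, size=1000),                # low-risk merchants
])
suspicious = np.array([
    [5000.0, 3, 0.9],   # large amount, 3 a.m., high-risk merchant
    [2500.0, 2, 0.8],
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal)

# predict() returns -1 for points the model flags as anomalous, 1 otherwise
print(model.predict(suspicious))   # -1 means "flag for review"
print(model.predict(normal[:3]))   # mostly 1 ("looks normal")
```

In practice such a model would only produce candidate alerts that feed into human review and further checks, not automatic blocking decisions.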

It is interesting to note that AI is helping the Australian government(8) combat scams and fraud with AI chatbots that can mimic humans in extended phone interactions with fraudulent callers. These chatbots identify suspicious patterns and behaviors, collect information on scammers, report them to law enforcement, and protect individuals and businesses from financial losses. A toy illustration of this kind of pattern matching appears below.
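The following sketch is not the Australian system; it is only a toy illustration of the general idea of scoring a call transcript against recurring scam-script patterns. The pattern list and threshold are made up for the example.

```python
# Hypothetical sketch: flagging scam-like call transcripts by counting
# matches against illustrative scam-script phrases.
import re

SCAM_PATTERNS = [
    r"gift\s+card",                     # payment in gift cards
    r"act\s+(now|immediately)",         # urgency pressure
    r"(wire|transfer)\s+the\s+money",   # irreversible payment methods
    r"do\s+not\s+tell\s+anyone",        # secrecy demands
    r"verify\s+your\s+(account|identity)",
]

def scam_score(transcript: str) -> int:
    """Count how many known scam-script patterns appear in a transcript."""
    text = transcript.lower()
    return sum(1 for pattern in SCAM_PATTERNS if re.search(pattern, text))

transcript = (
    "You must act now and wire the money today. "
    "Do not tell anyone, and verify your account over the phone."
)

if scam_score(transcript) >= 2:   # illustrative threshold
    print("Transcript matches known scam-script patterns; escalate for review.")
```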

Additionally, there are AI-powered voice analytics tools that can distinguish cloned voices from original ones, along with other sophisticated detection techniques that alert customers when the model recognizes a communication pattern that looks like social engineering (see the sketch below).
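As a rough sketch of how a voice-clone detector could be framed as a classification problem, the example below averages MFCC audio features and fits a simple logistic regression. The file paths, labels, and feature choice are assumptions for illustration; real detectors use much richer spectral and temporal cues and large labeled datasets.

```python
# Hypothetical sketch: a binary classifier separating cloned from genuine
# voice clips using averaged MFCC features. Paths and labels are illustrative.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def mfcc_embedding(path: str, sr: int = 16000, n_mfcc: int = 20) -> np.ndarray:
    """Load an audio clip and return its mean MFCC vector as a fixed-size embedding."""
    audio, _ = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Assumed directory of labeled clips: (path, label) where 1 = cloned, 0 = genuine
training_clips = [
    ("clips/genuine_001.wav", 0),
    ("clips/genuine_002.wav", 0),
    ("clips/cloned_001.wav", 1),
    ("clips/cloned_002.wav", 1),
]

X = np.stack([mfcc_embedding(path) for path, _ in training_clips])
y = np.array([label for _, label in training_clips])

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new clip: estimated probability that it is a voice clone
suspect = mfcc_embedding("clips/unknown_caller.wav")
print(f"clone probability: {clf.predict_proba([suspect])[0, 1]:.2f}")
```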

The top three ways we could reduce these scams and frauds are:

  1. Vigilance: Continuously monitoring what kinds of frauds and scams are being reported and how they operate.
  2. Preparation: We need to adapt to the evolving threats both at the organizational level and at the individual level so that we can devise the right action plan.
  3. Knowledge sharing: This means working across technology companies, financial organizations, and the government to develop safeguards and guardrails to protect people against fraudsters.

Resources

  1. The Federal Trade Commission report
  2. Sour Grapes: stomping on a Cambodia-based “pig butchering” scam, by Sean Gallagher
  3. The Dark Side of AI: Large-Scale Scam Campaigns Made Possible by Generative AI
  4. AI-generated deepfake images target young girls
  5. Where is my son?
  6. ‘Mom, these bad men have me’: She believes scammers cloned her daughter’s voice in a fake kidnapping
  7. Is your valentine a chatbot? Experts urge caution amid rising AI scams
  8. Conversational AI trained to bust scammers’ business models using scam script patterns in Australia
  9. Scammers use AI to enhance their family emergency schemes
  10. AI voice scam


Savneet Singh

Learning Experience Architect by profession and AI Ethicist by passion