Beware of AI-generated scams in 2024

Geekmag
3 min read · Feb 6, 2024


As technology advances, scammers get better at using AI to trick people and steal their money. That’s why fostering a zero trust mindset can help protect you from these scams.

Today, most of us spend a lot of time consuming AI-generated content that is not authentic. Scammers may use chatbots to generate phishing emails, fake websites, fake posts, fake profiles, and fake consumer reviews, or to help create malware, apps, ransomware, and prompt-injection attacks on websites. They may also use deepfakes for voice-cloning scams, extortion, and financial fraud.

Here are the top scams to be aware of in 2024:

AI-powered voice identity scam

AI allows scammers to create fake voicemails from people you know and trust. Imagine receiving a voicemail from your boss asking you to transfer money to a bank account. You follow the instructions without realizing that the voice message was generated by deepfake voice technology. And just like that, you have fallen for a scam.

Scammers also use AI-generated videos to pose as a family member, such as a grandchild, who claims to be in distress and needs money.

Another fraudulent application of deepfakes is impersonating politicians to sow confusion or misinformation. This is certainly a topic to watch in 2024, as it is an election year.

What to do: Before responding, verify the situation by calling or texting the person directly using a number you already know.

Investment scam

Investment scams can appear as trustworthy financial organizations promising incredible returns. Be careful in such situations. Scammers often push schemes related to cryptocurrencies.

Another scam to be wary of in 2024 involves fake loans that offer low interest rates but never materialize. Instead, victims are required to pay an upfront fee or provide their personal information.

Be very careful with any offer that seems too good to be true!

What to do: Be careful and check the legitimacy of the request by calling the financial organization using the official contact details provided on the institution’s website.

Identity and profile theft on social networks

Identity theft involves stealing personal information to commit fraud. Scammers do this via phishing emails or phone calls.

Social media profile theft involves hackers hijacking your social media accounts to reach your contacts, impersonate you, spread malware, scams, or demand money.

What to do: Avoid sharing personal or account information over the phone or through emails or online forms.

Scam on dating sites

Scammers can create a deepfake face and voice matching the preferences in a victim’s dating app profile. They then attempt to engage the victim in conversation and trick them into saying something incriminating in order to blackmail them or ask them for money.

What to do: Stay alert to potential scams when interacting with someone you don’t know. Be aware that the person on the other end of a phone call or video chat may not be who they seem. Never send money to someone you’ve only spoken to online. If you do plan to meet in person, do so in a public place.

Charity scam

After a humanitarian crisis, charity scams frequently appear. These scams play on your emotions and ask you to donate money to a good cause, but in reality, the charities are fake.

What to do: Watch out in particular for requests to use unusual payment methods instead of secure payment platforms. Always make sure a charity is legitimate before donating.

Tech support scam

Posing as computer technicians, scammers try to convince you that there is a problem with your computer and offer to repair it. They then take control of your computer and steal your personal information or hack your bank account.

What to do: Hang up immediately and contact your IT support center directly.

AI imitates your voice

Although social media engagement through recorded videos and participation in podcasts is encouraged, it is often difficult to know who is listening on the other end.

Scammers only need three seconds of your voice to imitate it!

https://geek-mag.net/attention-a-ses-arnaques-crees-par-lia-en-2024/
