
Scammers are using AI to impersonate your loved ones. Here's what to watch out for

Mar 09, 2023 | Hi-network.com
Kilito Chan/Getty Images

Imagine getting a phone call that your loved one is in distress. In that moment, your instinct would most likely be to do anything to help them get out of danger's way, including wiring money. 

Scammers are aware of this Achilles' heel and are now using AI to exploit it. 

Also: These experts are racing to protect AI from hackers. Time is running out 

A report from The Washington Post featured an elderly couple, Ruth and Greg Card, who fell victim to an impersonation phone call scam. 

Ruth, 73, got a phone call from a person she thought was her grandson. He told her he was in jail, with no wallet or cell phone, and needed cash fast. Like any concerned grandparents would, Ruth and her husband Greg, 75, rushed to the bank to get the money. 

It was only at the second bank they visited that a manager warned them: the bank had seen a similar case before that turned out to be a scam, and this one was likely a scam, too. 

This scam isn't an isolated incident. The report indicates that in 2022, impostor scams were the second most popular racket in America, with over 36,000 people falling victim to scams impersonating their friends and family. More than 5,100 of those scams happened over the phone, robbing people of over $11 million, according to FTC officials. 

Also: The best AI chatbots: ChatGPT and other alternatives to try

Generative AI has been making quite a buzz lately, thanks to the surging popularity of programs such as OpenAI's ChatGPT and DALL-E. These programs are mostly associated with advanced capabilities that boost users' productivity. 

However, the same techniques that are used to train those helpful language models can be used to train more harmful programs, such as AI voice generators. 

These programs analyze a person's voice for the patterns that make up their unique sound, such as pitch and accent, and then recreate it. Many of these tools work within seconds and can produce a voice that is virtually indistinguishable from the original source.  
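To make that less abstract, here is a minimal, illustrative sketch in Python of the analysis step described above: pulling out pitch and a compact timbre "fingerprint" from a short recording using the open-source librosa library. The file name voice_sample.wav is a placeholder, and this is only the feature-extraction stage, not a cloning tool.

```python
# Illustrative sketch: extract the vocal traits mentioned above (pitch, timbre)
# from a short speech clip. Assumes a local file "voice_sample.wav" exists.
import librosa
import numpy as np

# Load a few seconds of speech (librosa resamples to 22,050 Hz by default)
audio, sample_rate = librosa.load("voice_sample.wav", duration=5.0)

# Estimate the fundamental frequency (pitch) over time
f0, voiced_flag, voiced_probs = librosa.pyin(
    audio,
    fmin=librosa.note_to_hz("C2"),  # ~65 Hz, low end of human speech
    fmax=librosa.note_to_hz("C7"),  # well above normal speech
    sr=sample_rate,
)

# Summarize timbre with mel-frequency cepstral coefficients (MFCCs),
# a standard compact representation of how a voice sounds
mfccs = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=13)

print(f"Median pitch: {np.nanmedian(f0):.1f} Hz")
print(f"MFCC profile shape: {mfccs.shape}")  # (13 coefficients, time frames)
```

Voice-cloning systems build on features like these, training a model on enough samples to generate new speech in the same voice, which is why even a short clip posted online can be enough raw material for a convincing fake.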

What you can do to protect yourself

So what can you do to prevent yourself from falling for the scam? The first step is being aware that this type of call is a possibility. 

If you get a call for help from one of your loved ones, remember that it could very well be a robot talking instead. To make sure it is actually a loved one, attempt to verify the source. 

Also: The looming horror of AI voice replication

Try asking the caller a personal question that only your loved one would know the answer to. This can be as simple as asking for the name of a pet or a family member, or some other personal detail. 

You can also check your loved one's location to see if it matches up with where they say they are. Today, it's common to share your location with friends and family, and in this scenario, it can come in extra handy.   

You can also try calling or texting your loved one from another phone to verify the caller's identity. If your loved one picks up or texts back and doesn't know what you're talking about, you've got your answer.

Lastly, before making any big monetary decisions, consider reaching out to the authorities for guidance on the best way to proceed. 

Artificial Intelligence

  • The impact of artificial intelligence on software development? Still unclear
  • Android 14's AI-generated wallpapers are super fun. Here's how to create them
  • AI aims to predict and fix developer coding errors before disaster strikes
  • Generative AI is everything, everywhere, all at once

Hot Tags: Artificial Intelligence, Innovation
