Kidnapping Scams Go High-Tech with AI-Mimicked Voices

Cybercriminals are using artificial intelligence (AI) to mimic the voices of people in kidnapping scams. The scams are rare, but they are happening, and they can be very convincing.

You may feel confident in your ability to avoid becoming a victim of cyber scams. You know what to look for, and you won’t let someone fool you. Then you receive a phone call from your son, which is unusual because he rarely calls. You hear a shout and sounds resembling a scuffle, making you take immediate notice. Suddenly, you hear a voice that you are absolutely certain is your son, screaming for help. When the alleged kidnappers come on the line and demand money to keep your son safe, you are sure that everything is real because you heard his voice.

Unfortunately, that scenario is no longer hypothetical. Scammers are using AI voice-cloning tools to impersonate loved ones in fake kidnapping calls. The scam is still rare, but it is happening.

In one reported case, a woman received a phone call from her "son," who claimed to have been kidnapped. Convinced the voice was real, she sent the scammers $10,000. Only later did she realize she had been defrauded.

AI voice-cloning tools can produce extremely realistic speech from just a short sample of someone's voice, often scraped from social media or other recordings, making it difficult to tell a real call from a fake one.

If you receive a phone call from someone claiming to be a kidnapped loved one, do not send any money. Hang up and call the police.

Here are some tips to help you avoid becoming a victim of a kidnapping scam:

Never send money to someone you don’t know.

If you receive a call claiming that a loved one has been kidnapped, hang up and try to reach that person directly at a number you know to be theirs.

Do not volunteer personal information over the phone. Scammers may use details you let slip, such as the names of family members, to make the call seem more credible.

Be alert to common pressure tactics. Scammers create urgency, demand secrecy, and insist on hard-to-trace payment methods such as wire transfers or gift cards.

If you are still unsure whether a call is legitimate, hang up and contact the police.

AI is making voice-based scams more convincing, so awareness is your best defense. Following these tips can help protect you from becoming a victim.

Both the FBI and the Better Business Bureau have issued warnings about the use of AI in kidnapping scams.

If you encounter one of these scams, report it to your local police and to the FBI's Internet Crime Complaint Center (IC3).