Can AI help someone stage a fake kidnapping scam against your family?

You may feel confident in your ability to avoid becoming a victim of cyber scams. You know what to look for, and you won’t let someone fool you.

Then you receive a phone call from your son, which is unusual because he rarely calls. You hear a shout and sounds resembling a scuffle, making you take immediate notice. Suddenly, you hear a voice that you are absolutely certain is your son, screaming for help. When the alleged kidnappers come on the line and demand money to keep your son safe, you are sure that everything is real because you heard his voice.

Unfortunately, scammers are using AI to clone people's voices and deploy those fakes in schemes like kidnapping scams. This particular scam still appears to be rare, but it is happening.

How frequent are fake kidnapping calls enhanced with AI?

You may recall the incident in late 2023 in which an AI-generated version of Tom Hanks appeared in a fake dental plan advertisement without his consent. High-profile impersonations like that are still the exception, and AI-enhanced kidnapping calls appear to be similarly rare, but they are happening.


How does an AI fake call work?

With just a few seconds of someone's voice, often lifted from videos posted on social media, AI voice-cloning tools can generate speech that sounds convincingly like that person. Scammers then play that cloned audio during a call, adding background noise and a threatening "kidnapper" to sell the story.


Do I have to worry about falling for a fake AI audio kidnapping scheme?

For most people, the odds of receiving one of these calls are still low. But the tools behind them are cheap and widely available, so it pays to know the warning signs before your phone rings.


Steps you can take to protect yourself from a fake kidnapping scam

1) Ask your loved ones to keep you informed about trips: If you know where your family members actually are, it is much easier to check, or dismiss, a caller's claim that one of them has been taken.

2) Set up a safe word or phrase: Agree on a word or phrase that only your family knows, and ask for it on any call claiming an emergency. A voice clone built from public audio will not know it.

3) Use privacy settings on social media: The less audio and personal detail scammers can scrape from your public posts, the harder it is for them to clone a voice or build a convincing story.

4) Try to text your loved one: While the caller keeps you talking, message the person directly from another device. A quick reply can expose the scam on the spot.

5) Stay calm and think things through: Scammers count on panic. Pause, ask questions only your loved one could answer, and take a moment to verify before sending any money.


Kurt’s key takeaways

Are you concerned about how scammers may take advantage of AI to create new scams? Let us know in the comments below.
