
Emergency scams targeting grandparents and families are on the rise again

The phone rings. The caller claims to be one of your relatives – often a grandchild – and he or she is panicking.

They’re in some kind of emergency: a car accident, or they’ve been arrested. They need you to send money as soon as possible, and of course you – grandma or grandpa – do what any grandparent would and help however you can.

Yes, it’s the “grandparent scam,” but it’s not the original version. It has evolved thanks to artificial intelligence and voice cloning.

In the year since ConsumerAffairs first reported on the anticipated threat of voice cloning, the grandparent scam has become so convincing that one man fatally shot an Uber driver because he wrongly assumed she was part of a scheme to extract $12,000 in purported bond money for a cousin. Even consumer professionals are not immune: one consumer advocate has shared a personal account of becoming a victim despite working in the fight against fraud.

Business is good for the scammers. A grandparent scam ring in Canada reportedly raked in more than $2 million before being tracked down and arrested.

AI makes these scams far more convincing: once scammers capture a few seconds of a family member’s voice, they can make that “family member” say anything simply by typing in dialogue.

How do you stay safe in this AI-driven world?

With robocalls, caller ID and call-screening apps have done a fairly good job of cutting off scammers, but modern software can spoof phone numbers. When a call appears to come from a known number, many people answer it thinking they’re safe.

But cybersecurity watchers are pushing something new that scammers may not be able to ignore: “AI security words.”

The concept is gaining popularity and is easy to put into practice. You simply agree with family members and friends on a unique phrase or challenge question that only you know, which can then be used to verify identity over the phone.

The other security step we can all take is deleting the videos we’ve posted on social media. “For example, if your Facebook is open, it is not locked, so anyone can view your profile.

“If there is even 15 seconds of audio available on the internet of the person whose voice you are trying to imitate, that could be enough,” Truman Kain, a cybersecurity expert from Huntress, told FOX23 News.

According to Kain, people should not leave their social media unlocked and open; those privacy settings should be locked down, he said.