How to Identify an AI *Kidnapping* Scam
#NEWSLETTER | AI phone scams claiming a loved one has been *kidnapped* are on the rise, but this particular fraud is only as effective as our willingness to believe what we hear...
Last week, a school district in Pennsylvania sent out a warning that local parents had been called from (what appeared to be) the school’s main phone number and told their child had been kidnapped (…the parents could “hear” their child in the background). As one parent talked to the scammer, the other called the school separately and was able to confirm that their child was safe (…the right way to handle it).
Oakland Unified School District sent out a similar warning late last year after several families received these scary phone calls. The fraudsters seemed to be targeting elementary school parents in that particular community.
An Arizona mother’s story gained national attention in 2023 after she shared it publicly in testimony to Congress, with the FTC following up with alerts to consumers regarding these voice-cloning scams.
Even if you’ve been untouched and are relatively unconcerned about being duped, it remains true that knowledge is our collective superpower 🦹🏼⚡️💡🦸🏽🔥 and the more awareness there is regarding these issues, the faster we can help one another shut them down.
What’s Happening?
As the ability to manipulate image, video, and audio data with new artificial intelligence tools has gotten more sophisticated, fraudsters are relishing the opportunity to pounce.
In fact, the technology is so rife with potential for exploitation that even the famed investor Warren Buffett said last week: “If I was interested in investing in scamming, it’s going to be the growth industry of all time.”
Of course, bad actors are always “early adopters” of new technology, but their hustle does require our collective naïveté to work.
So think of this one as “Nigerian Prince letter scam” supersized by AI and fueled by our penchant for sharing video, image, and audio content publicly online.
The good news is that the scammer’s game is easily exposed, and equally, we can do a lot to prevent being duped in the first place. Read more here ⤵️
What Else to Know?
We do need to understand that fraudsters are primarily sampling voice content from public social media accounts.
And no victim shaming here…families can, and should, make their own decisions around publicly sharing video/audio content… but it does help to know how fraudsters get the data.
We can also work harder to not give criminals a reason to go after us in the first place. Preventing hacks and not falling victim to phishing is a good start. Read last week’s update on protecting yourself here ⤵️
How about Deepfakes?
Specific to this scam, video hasn’t been employed (yet). Voice is more easily manipulated and requires less expertise (so far). But it’s safe to assume that cheap, easy and believable video is not far off. The good news is that the same approach to debunking and preventing these will still hold true.
[For paid subscribers with access to the “Resources” section, I also created a handy tip sheet to print out and stick on the fridge… Find here ⤵️]
Also on Deepfakes…
It’s worth noting that deepfake nudes/porn is a different type of hell, and a very real crisis now too. Unfortunately this category of deepfake doesn’t have to be realistic, or particularly sophisticated, to cause havoc and distress. It’s an issue warranting a separate focus, so look for content on the site this week and in next week’s newsletter.