Criminals Exploit AI to Scam Parents with Cloned Voices of Their Children


  • Criminals use AI to clone children’s voices and scam parents, exploiting emotions and vulnerabilities.
  • AI technology enables scammers to achieve startling voice-matching accuracy.
  • Vigilance and verification are essential to protecting against AI-driven voice cloning scams.

In a disturbing development, criminals are harnessing artificial intelligence (AI) to hack children’s social media accounts and clone their voices, perpetrating scams that prey on parents’ emotions and goodwill. The trend has raised fresh concerns about online security and how vulnerable individuals are to AI-enabled deception.

The AI-driven voice cloning scams

Security experts warn that even unsophisticated scammers can now use readily available online AI tools to turn just three seconds of a child’s voice into deepfake audio with an 85 to 90 percent voice match. With additional effort, hackers can reach a 95 percent match, according to research from security firm McAfee. That accuracy allows criminals to exploit the trust of friends and family members, tricking them into parting with their money.

The ‘hi mum, hi dad’ scam

Fraud experts have issued fresh warnings about ‘hi mum, hi dad’ texts, in which scammers manipulate parents into believing their children are in dire trouble. Vonny Gamot, head of Europe, Middle East, and Africa at McAfee, says AI is now a catalyst for such scams, breathing new life into old tricks. The rise of social media use among younger generations, and their preference for voice notes over text messages, has fueled the surge in this type of fraud.

How criminals execute the scam

Hackers are targeting children’s social media accounts on platforms such as Facebook, Instagram, Snapchat, and TikTok to extract voice samples. These samples are sourced from videos posted online or from voice messages, and often amount to no more than three seconds of audio.

Creating convincing scams

Once hackers have collected these voice samples, they use AI tools to craft fake voicemails or voice messages, which are then sent to the ‘mum’ and ‘dad’ contacts saved in the children’s phones. When a parent hears what sounds like their child’s voice in distress, panic sets in, clouding their judgment and common sense.

The widespread impact

The impact of voice cloning scams extends beyond children. A McAfee survey found that 50 percent of adults share their voice data online at least once a week through social media or voice notes, while 65 percent of adults are not confident they could distinguish a cloned voice from the real thing.

Financial consequences

Falling victim to an AI voice scam can have dire financial consequences. Forty percent of those who fell prey to such scams reported losses exceeding £1,000, and 6 percent suffered losses of £5,000 to £15,000.

Mitigating AI voice scams

McAfee advises anyone targeted by an AI voice scam to stay calm and take a breath, then verify the distress message by contacting the friend or family member in question directly, ideally through a known and trusted channel such as their usual phone number.

The growing digital generation

Vonny Gamot emphasizes how digitized younger generations have become. Children are gaining access to digital devices at ever younger ages, and schools’ use of online homework is expanding their digital footprint further.

The rise of AI-powered voice cloning scams underscores the evolving landscape of online fraud. Criminals are leveraging technology to manipulate the emotions of parents and loved ones, exposing them to significant financial losses. As younger generations spend more of their lives online, individuals must remain vigilant when confronted with suspicious messages or distress calls. Awareness, education, and protective measures are crucial to safeguarding against these AI-driven threats.


Glory Kaburu

