AI’s Role in Future Scams: Unveiling the “Hi Mum” WhatsApp Trick

Fraudsters are targeting parents through the “Hi Mum” WhatsApp scam, with experts warning these sophisticated deceptions are evolving at “breakneck speed.”

Key Takeaways

  • The “Hi Mum” scam involves criminals impersonating children in distress, claiming they’ve lost or broken their phone and urgently need money
  • Fraudsters are now using AI voice cloning technology to create convincing voice messages that sound like real family members
  • UK victims lost £226,744 to these scams between 2023 and 2025, with sons being the most successfully impersonated family member
  • Security experts recommend calling the person’s actual number to verify requests and establishing family passwords for emergency communications
  • As AI technology advances, these scams are becoming increasingly sophisticated and harder to detect

The Growing Threat of “Hi Mum” Scams

A disturbing trend in digital fraud is sweeping across the UK and beyond, targeting the most vulnerable emotional connection: the parent-child relationship. The “Hi Mum” or “Hi Dad” WhatsApp scam begins with a simple text message from an unknown number claiming to be a child who has lost or broken their phone. The approach seems innocent enough—a family member needing to update their contact information—but quickly escalates into urgent requests for financial assistance, security codes, or personal information that can be exploited for theft and identity fraud.

“Hi, Mom, this is my new number. Can you save it and send me a message on WhatsApp as soon as you see this?” (Example of a scammer’s message)

The scam has proven remarkably effective, with Action Fraud reporting that UK victims lost £226,744 between 2023 and 2025. Data from Santander UK reveals an interesting pattern: scammers impersonating sons have the highest success rate, followed by daughters and mothers. The effectiveness of these scams relies on triggering parental protective instincts and exploiting the natural urgency parents feel when their children claim to be in trouble.

AI Voice Cloning: The Next Frontier in Scam Evolution

What makes this scam particularly concerning to security experts is its rapid evolution through AI technology. Traditional text-based scams are now being supplemented with voice notes and calls that use AI voice-cloning technology to mimic a family member’s voice with disturbing accuracy. This leap makes the scams dramatically harder to detect, especially for less tech-savvy individuals who may be unaware of what AI can do.

“We’re hearing of instances where AI voice impersonation technology is being used to create WhatsApp and SMS voice notes, making the scam seem ever more realistic,” said Chris Ainsley of Santander UK.

The scammers’ methodology follows a consistent pattern: they initiate contact with friendly messages from unknown numbers, create a sense of urgency around financial help, and direct payments to unfamiliar accounts under the pretext that their usual banking access is unavailable. These tactics play on both emotional vulnerability and the confusion that comes with unexpected emergency situations, creating a perfect storm for financial exploitation.
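
Message-screening tools look for this same combination of cues. The sketch below is a minimal, illustrative example in Python (the keyword lists and thresholds are assumptions for illustration, not taken from any real product), showing how the three hallmarks of the scam can be flagged together:

```python
import re

# Hypothetical keyword groups reflecting the pattern described above;
# real screening tools use far richer signals than this sketch.
NEW_NUMBER_CUES = [r"new number", r"lost my phone", r"broke(n)? (my )?phone"]
URGENCY_CUES    = [r"urgent", r"as soon as", r"right now", r"today", r"asap"]
PAYMENT_CUES    = [r"transfer", r"send (me )?money", r"bank details", r"sort code"]

def scam_indicators(message: str) -> list[str]:
    """Return which of the three 'Hi Mum' hallmarks appear in a message."""
    text = message.lower()
    hits = []
    for label, cues in [("claims a new number / lost phone", NEW_NUMBER_CUES),
                        ("creates urgency", URGENCY_CUES),
                        ("asks for a payment", PAYMENT_CUES)]:
        if any(re.search(cue, text) for cue in cues):
            hits.append(label)
    return hits

message = ("Hi Mum, this is my new number, I lost my phone. "
           "Can you transfer some money today? I can't get into my bank.")
found = scam_indicators(message)
if len(found) >= 2:  # two or more hallmarks together is a strong warning sign
    print("Likely 'Hi Mum' scam:", ", ".join(found))
```

Even a crude filter like this highlights why the scam works: no single message element looks alarming on its own, but the combination of a new number, manufactured urgency, and a payment request is the signature of the fraud.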

Protecting Yourself from Advanced Impersonation Scams

As these scams become more sophisticated, traditional verification methods are increasingly important. Security experts unanimously recommend one key protective measure: direct voice contact through known, verified phone numbers. No matter how convincing a text message or voice note might seem, taking the time to call your loved one’s actual number is the most reliable way to confirm their identity and situation.

“If you’re ever asked for money out of the blue on any social or communication platform, verify the request by picking up the phone,” says Ainsley.

Additional protective measures include establishing family passwords for emergency situations, limiting personal information shared on social media, activating two-factor authentication for financial accounts, and using security tools like Bitdefender Scamio to screen suspicious messages.

If you do fall victim to such a scam, contact your bank immediately to stop the payment, report the scam to WhatsApp, forward suspicious texts to 7726, and file a report with Action Fraud. As our government continues to fail in protecting citizens from increasingly sophisticated digital threats, personal vigilance becomes our primary defense.