Think It’s a Loved One Calling? Think Again 

    Not long ago, hearing a familiar voice on the phone was usually enough to trust the person on the other end. But today, that assumption is becoming riskier.

    With the assistance of AI, scammers can now replicate someone’s voice using only a short audio clip, sometimes taken from social media or online videos. The result can sound convincing enough to fool even close friends or family members. This technology is known as voice cloning, and while it has legitimate uses, criminals are increasingly using it to create highly believable scams.

    What Is Voice Cloning?

    Voice cloning is a type of AI technology that can replicate the sound of a person’s voice. By analysing recordings of someone speaking, the technology learns how they pronounce words, their tone, and the rhythm of their speech.

    While the technology has legitimate uses in areas like entertainment and accessibility, scammers are beginning to misuse it to create convincing phone calls that pressure people into sending money or sharing personal information.

    Red Flags of Voice Cloning

    • Urgent or emotional requests 
      • The caller may claim they are in serious trouble and need help immediately. They might say they’ve been in an accident, arrested, or stranded somewhere. Scammers rely on panic to make you act quickly without thinking. 
    • Requests for money right away 
      • If the caller asks you to send money urgently, especially through gift cards, wire transfers, or cryptocurrency, it’s a major warning sign. These payment methods are difficult to trace and are commonly used in scams. 
    • Pressure to keep the situation secret 
      • Scammers may tell you not to contact anyone else about the situation. This tactic is meant to stop you from verifying the story with family members, friends, or colleagues. 
    • Calls from unknown or hidden numbers 
      • Many voice cloning scams come from unfamiliar or withheld numbers. While this alone doesn’t mean it’s a scam, it’s a sign to be cautious. 
    • Strange speech patterns or pauses 
      • AI-generated voices can sometimes sound slightly unnatural. You might notice unusual pauses, robotic tones, or words that are pronounced strangely. 

    How to Protect Yourself

    A few simple steps can help reduce the risk of falling victim to a voice cloning scam. 

    • Create a family code word 
      • Agree on a private word or phrase with close family members. If you ever receive an emergency call, ask for the code word. 
    • Verify the story 
      • If you receive a distress call, hang up and contact the person directly using a number you already know. 
    • Limit what you share publicly 
      • Audio clips and videos posted on social media can be used to train AI voice models. 
    • Be cautious with unknown callers 
      • If you don’t recognise the number, let the caller speak first before engaging. 

    Think Before You Act

    Many scams rely on creating a sense of urgency. Voice cloning takes this a step further by using technology to mimic people you trust.

    Taking a moment to slow down, question the situation, and verify the request can make all the difference. 
