Deepfakes leverage artificial intelligence to produce highly realistic yet fake videos or audio of people, making it seem like they are saying or doing things they never actually did. Deepfake technology is being used to scam people on live video calls, posing significant risks to individuals and organisations alike.
On a video call, how do you know if you’re speaking to the real person or an impersonator using deepfake technology? In a world filled with deepfakes, seeing is no longer believing, so understanding and combating the dangers they pose is more crucial than ever. For a deep dive into what exactly a deepfake is and how to spot them, check out our Deepfake 101 blog.
The Dangers of Deepfakes
In a recent Trend Micro study conducted in June and July 2024, 64% of respondents said they were extremely concerned about criminals using deepfakes to scam people¹. Deepfake video calls in particular present a wide range of dangers, including identity theft and fraud, where scammers convincingly impersonate others for financial gain. In one such case, a Hong Kong finance worker was duped into transferring $25 million.
What’s particularly concerning about deepfake video calls is that cybercriminals can convincingly impersonate anyone—a friend, a new romantic partner, or a prospective employer—making it increasingly challenging to verify the true identity of the person you are communicating with.
Top Tips to Spot Deepfake Video Callers
Below are some best practices for identifying and mitigating the risks of deepfake video calls:
Be Skeptical
Adopting a skeptical mindset is a practical, real-time defence. When interacting with someone online or over a video call:
- Ask questions that only the real person would know the answers to, or refer to a made-up event or the name of a fictitious friend. How they respond will help confirm (or cast doubt on) their identity.
- Trust your instincts. If something feels off—even if they look, sound, and act like someone you know—end the video call and contact the real person directly, ideally using another method of communication (phone call, for example).
- Additionally, be extra suspicious of out-of-the-blue requests for money or personal information. As a rule, never send money or share personal information with anyone without properly verifying their identity. Remember, a stranger could be impersonating someone you know.
Stay Updated and Share Knowledge
Keep up to date with the latest deepfake and cybersecurity news and best practices.
Sharing knowledge and resources with friends and family can help raise awareness about the potential dangers of deepfakes and create a more vigilant community. Hint: You can share this article with your friends and family by clicking the social media icons at the bottom of the post!
Use Technology to Spot Deepfakes
Deepfakes are becoming increasingly sophisticated, making it difficult for humans to tell the real from the fake. To combat this, we’re launching Trend Micro Deepfake Inspector for Windows PC, a tool that can help you verify in real time whether the person on a video call is using deepfake technology to alter their appearance.
Don’t Overshare Online
Be cautious about the information you share online, as oversharing could arm scammers with information they could use to deceive you.
Report Suspected Deepfakes
If you encounter a suspected deepfake designed to scam you, defraud you, or steal your identity, report it to the relevant authorities, such as your local police department or consumer protection agencies, as well as to the platform where you saw it. Prompt reporting can help curb the spread of fake content and protect others from being deceived by the deepfake.
Challenging, But Not Unsolvable
There’s no doubt about it: deepfakes pose significant challenges to our privacy, security, and society in general. However, with the right critical mindset, ongoing education, and tools like our Deepfake Inspector for Windows PC, we can all effectively mitigate the risks and protect ourselves from potential harm.
¹ Trend Micro Snap Study – Consumers and Deepfakes, conducted June–July 2024, US & Australia, N=2,097