How to protect yourself from deepfake scams
Richard Howard • August 17, 2020
Even if you’re not one to follow the cutting edge of technology, chances are you’ve heard the term deepfake by now and may even have heard of deepfake scams. But what is a deepfake?
Here’s the skinny: using artificial intelligence, those crazy tech folk have created technology with which one can manipulate audio, photos, and video like never before. “Synthetic media” is actually the collective term with “deepfake” referring to visual media, but we’ll use deepfake as a catch-all for simplicity.
While it’s mostly been used for entertainment like this clip of “Jay-Z” reciting Hamlet and a mind-bending visual accompaniment to celebrity impressions, it’s clear there can be a dark side to this technology. Unethical parties could use deepfake software to mislead the public or for political gain.
It could affect you even more directly. The potential for deepfake fraud, for instance, is astronomical. As the technology gets more and more convincing, it’s a good idea to consider ways to protect yourself from deepfake scams.
How deepfakes work
While we won’t delve too deeply into the technical side, it’s important to understand how all this works when discussing how deepfake scams could be developed. Using a combination of artificial intelligence and machine learning (the “deep” in “deepfake” stands for “deep learning”), key characteristics of a person’s appearance or voice can be overlaid on another’s. In 2017, an AI startup called Lyrebird released voice clones of President Trump, President Obama, and Hillary Clinton. Since then, the quality has greatly improved, and there is now even a Lyrebird app for public use. In fact, there are a host of voice cloning and deepfake apps, including Zao, REFACE, Face Swap, and Overdub.
That’s what makes the potential for widespread deepfake fraud so high: the availability of the software and of the media it uses. You don’t need expensive hardware or expertise; with deepfake software, there’s literally “an app for that.” And while the technology has been around for a while, the amount of media available today has made it much more viable. If you can get video or audio of a person, you can create a deepfake. The more source material you can feed the algorithm, the better the deepfake.
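To make the “overlaid on another’s” idea concrete, here’s a toy sketch of the architecture many face-swap tools use: a single shared encoder that compresses any face into a small latent code, plus one decoder per person. The names, sizes, and the use of untrained random matrices are all illustrative assumptions, just to show the data flow, not a working deepfake.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT = 32       # size of the shared latent representation
PIXELS = 64 * 64  # a flattened toy "face image"

# One shared encoder compresses any face into a latent code.
# (Real deepfake tools train deep networks for this; random
# matrices here are placeholders to illustrate the data flow.)
encoder = rng.normal(size=(LATENT, PIXELS)) / np.sqrt(PIXELS)

# One decoder per person, each trained to reconstruct only that person.
decoder_a = rng.normal(size=(PIXELS, LATENT)) / np.sqrt(LATENT)
decoder_b = rng.normal(size=(PIXELS, LATENT)) / np.sqrt(LATENT)

def swap_face(face_of_a: np.ndarray) -> np.ndarray:
    """Encode person A's face, then decode with person B's decoder.

    After training, the latent code captures pose and expression
    while the decoder supplies identity -- so the output would show
    person B wearing person A's expression.
    """
    latent = encoder @ face_of_a
    return decoder_b @ latent

face = rng.normal(size=PIXELS)  # stand-in for a real photo of person A
fake = swap_face(face)
print(fake.shape)  # (4096,)
```

This is also why source material matters so much: the more footage of person B the decoder sees during training, the more convincing the reconstruction.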
How voice cloning and deepfake scams (could) work
Thankfully, deepfake scams aren’t commonplace, but a high-profile example gives a pretty good template of what they could look like. Last year, the CEO of a UK energy firm followed directions given over the phone by the chief executive of the firm’s parent company to transfer €220,000 (about $243,000 USD) to one of their suppliers. The problem was, it wasn’t the parent company’s CEO speaking at all, but rather a convincing example of voice cloning. According to the scam victim, the voice was indistinguishable from the real thing, and he only caught on due to certain inconsistencies, including the phone number being Austrian when it should have been German. In all likelihood, this scam utilized an easily downloaded deepfake app.
Now imagine you’re buying a house. Your real estate agent, Matt, calls you excitedly to say your offer has been accepted, but you’ve got to transfer the down payment now. He’s calling from his private phone because he left his work phone at the office, but never mind that, you’ve got to lock this in before the seller moves on! He emails you the banking information, you shoot the cash over, and you’re the proud owner of a new home. Except now that number isn’t returning your calls, and then you take a closer look at the email address and notice the agency’s name is subtly misspelled. You call Matt on his real number, and your worst fears are realized: he has no idea what you’re talking about. Sound outlandish? It’s not. The threat is real enough that major banks are warning their customers to be on the lookout.
Video deepfakes have prompted fears of deception on a larger level. The first many people heard of this technology was when the disturbing practice of superimposing celebrities’ faces into pornographic films came to light. Naturally, the same could be done to unsuspecting everyday people like us once the malicious party has access to video footage or images. Another serious threat is in the form of public or political misinformation. Deepfakes of former and current political figures have already been created both for entertainment and educational purposes. It’s a near certainty that at some point, unscrupulous parties will begin using similar deepfakes for their own gain.
How to protect yourself from deepfake scams
Those of us who think of ourselves as tech-savvy may feel like we’re impervious to the plethora of email and robocall scams. And some of us would be right. However, this is a whole new ball game. We’re used to trusting our eyes and ears, so particularly well-executed deepfakes may be harder to spot than you think, and they’re only going to get better. That’s why many experts recommend strengthening your general security practices in addition to relying on deepfake detection.
Security practices to protect from deepfakes
- Trust but verify: That voice message may sound real. Regardless, it’s best to call back a number you know to be correct to verify that transfer your colleague requested.
- Consider the source: Unless that request for donations came from a source you’re familiar with and trust completely, don’t click that button until you verify. Similarly, if you’ve never heard of the news source, visit one you trust to confirm.
- Look for little inconsistencies: Take a close look at the phone number, email, or account the audio or video came from. One wrong number or missing letter is hard to spot unless you’re always vigilant.
- Limit access to your voice and images: As we’ve learned, scammers need recordings, images, or footage of you to create deepfakes. To keep your likeness from being replicated, limit your presence on social media or make your accounts private, accepting requests only from people you trust.
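Those little inconsistencies can even be checked mechanically. Here’s a minimal sketch that flags a sender address whose domain is only a character or two away from one you trust, using the classic Levenshtein edit distance. The domains and the `looks_like_spoof` helper are hypothetical examples, not part of any real tool.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def looks_like_spoof(sender: str, trusted_domain: str) -> bool:
    """Flag a sender whose domain is *almost* a trusted domain.

    Distance 0 is an exact match (fine); a distance of 1-2 is a
    near miss, which is exactly what a scammer hopes you won't notice.
    """
    domain = sender.rsplit("@", 1)[-1].lower()
    dist = edit_distance(domain, trusted_domain.lower())
    return 0 < dist <= 2

# Hypothetical example domains:
print(looks_like_spoof("matt@littiehomes.com", "littlehomes.com"))  # True
print(looks_like_spoof("matt@littlehomes.com", "littlehomes.com"))  # False
```

A single substituted letter gives a distance of 1, so the lookalike is flagged while mail from the genuine domain passes, the same kind of check your eye should be doing on every money-related email.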
That being said, there are tell-tale signs that, at least for now, are likely to show up in unprofessionally created deepfakes. Things to look out for when attempting to spot a deepfake include:
- Unnatural speech cadence
- Unnatural movement
- Unnatural blinking
- Unexpected shifts in lighting and skin tone
- Poor synchronization of lips and speech
- Low-quality audio and/or video
- Digital artifacts (“noise” in audio and video)
The future of deepfake scams
The good news is, right now successful deepfake scams like the UK energy firm fiasco are few and far between. The bad news is that their prevalence is sure to increase in the immediate future, and they’re going to become harder to detect as deepfake software improves. Your best bet is to religiously incorporate good security practices now, from basic smartphone security to the very specific techniques outlined above.