Santander has created its own deepfake video to show customers how convincing the technology can be, as only a fifth of Brits believe they could identify one.
The lack of awareness of the technology was underlined by the finding that more than half (53%) of those surveyed had either not heard the term ‘deepfake’ or did not understand what it meant.
Despite this, 54% are worried about deepfakes being used for fraud, and around half (51%) of Santander’s 2,000 respondents fear a family member could fall victim to such a scam.
So, could you tell the difference between what’s real and what’s a deepfake?
In a bid to highlight AI scam risks, Santander created its own deepfake video of financial influencer Mr Money Jar.
In the clip, an AI version of Mr Money Jar, whose real name is Timi Merriman-Johnson, told viewers about “an incredible investment opportunity the mainstream banks don’t want you to know about”. As part of the ‘deal’, the digitally manipulated version said “you could double your money every two weeks”.
As convincing as the video may seem, Merriman-Johnson tells viewers to keep an eye out for the ‘wonky’ hand movements and unnatural lip motion of his fake counterpart as it speaks. Another tell-tale sign is the blurry background behind the AI-generated figure.
Brits see deepfakes most often on Facebook
While many Brits are unaware of the term, more than a third (36%) said they had seen an AI-manipulated video on social media, with sightings spread fairly evenly across the major platforms.
Facebook was the most common, with over a quarter (28%) of respondents having seen a deepfake video on the platform, followed closely by X, formerly Twitter (26%), TikTok (23%) and Instagram (22%).
But what exactly is a deepfake? It is a video, audio clip or image that has been digitally manipulated to impersonate someone, tricking viewers into believing it genuinely shows that person.
As deepfake scams have risen over the past year, celebrities, MPs and other influential figures have been replicated, including Taylor Swift and MoneySavingExpert founder Martin Lewis.
Fraudsters have used the videos to trick victims into transferring money under the illusion they are backing a ‘win-win’ investment, usually involving cryptocurrency.
Given their use by scammers, concerns about deepfakes centre mostly on personal finances: 54% fear the technology will be used to steal people’s money, ahead of other risks such as election manipulation or misuse of personal data.
Generative AI is developing at breakneck speed
Chris Ainsley, head of fraud risk management at Santander, warned of the pace at which the technology is progressing.
Ainsley, who had a deepfake version of himself created, said: “Generative AI is developing at breakneck speed, and we know it’s ‘when’ rather than ‘if’ we start to see an influx of scams with deepfakes lurking behind them.
“We already know fraudsters flood social media with fake investment opportunities and bogus love interests, and unfortunately, it’s highly likely that deepfakes will begin to be used to create even more convincing scams of these types.”
Ainsley added: “More than ever, be on your guard, and just because something might appear legitimate at first sight doesn’t mean it is. Look out for those tell-tale signs, and if something – or someone – appears too good to be true, it’s probably just that.”
The lender has provided three tips to avoid being conned by a deepfake.
How not to be duped:
- Most deepfakes are still imperfect. Whether there’s blurring around the mouth, less blinking than normal, or odd reflections – look out for the giveaways.
- Ask yourself the same common-sense questions you do now. Is this too good to be true? If this is real, why isn’t everybody doing this? If this is legitimate, why are they asking me to lie to my family and/or bank?
- Know what types of scams deepfakes are likely to be used for. Criminals are likely to use deepfakes in investment scams and impersonation fraud, such as romance scams. If you know the tell-tale signs of these scams, you’ll be able to spot them – even if a deepfake has been used.