BLOG: Deepfakes, Metaverse, and future fraud threats to watch out for
The UK, like many countries, is in the throes of a cost-of-living crisis, the likes of which have not been seen in decades. Many will be concerned about their finances, with harder challenges ahead as energy prices and inflation climb further still.
But the cost of living is just one pertinent issue. Fraud is another. And the current economic climate is ideal for fraudsters.
Fraud is becoming more common
In the UK alone last year, fraud losses totalled an eye-watering £137bn. This growth is the result, in part at least, of our increased use of digital services, including digital banking, throughout the pandemic. In fact, the first half of 2021 saw a 285% rise in online fraud compared to the previous year. And as technology advances, criminals are updating their tactics – we must respond with greater vigilance and awareness of the threat at hand.
One of the most common types of fraud in the UK is Authorised Push Payment (APP) fraud, with losses totalling £582.3m in 2021. Essentially, this is where a customer is tricked into authorising a payment into a criminal’s account. Propelling this volume of crime is the growing sophistication with which fraudsters lure people into parting with personal details or making payments to fraudulent companies and accounts.
The clumsy phishing scammer has been supplanted by fraudsters adept at passing themselves off as an official from your bank, the NHS, or a service provider. Some will even play a long game, befriending a target online over weeks or months before exploiting that trust.
Staying safe online is difficult these days. As our reliance on the digital world continues to grow – for how we shop, communicate, bank and manage our lives – we’re exposing ourselves to ever greater threats. Here are two emerging avenues for APP fraud that are set to become ever more prevalent in the months and years ahead.
Deepfakes: seeing is no longer believing
Some readers will have seen the eerie deepfakes of political leaders online in recent years – one of Barack Obama was famously shared to raise awareness of the issue. Since then, the technology has become far more sophisticated and has even reached the entertainment industry, where actors can now appear on screen as they looked 40 years ago.
If you’re unfamiliar with it, deepfake technology produces an extremely realistic fake video that appears to show a real person speaking. Such videos are created by machine-learning models trained on the available video and audio footage of the person being recreated; the more footage available, the more realistic and true-to-life the result. Once a base model of a person has been built, today’s technology can make the ‘fake’ deliver entirely new sentences it is given.
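For the technically curious, the classic face-swap approach behind many deepfakes pairs one shared encoder with a separate decoder per identity: a frame of person A is encoded into a person-agnostic representation, then decoded with person B’s decoder to render it as person B. The sketch below illustrates only this data flow – the weights are random stand-ins for what a real system would learn through training, and all dimensions are illustrative assumptions, not a production recipe.

```python
import numpy as np

# Classic face-swap deepfakes use one shared encoder plus one decoder
# per identity. Decoding person A's encoding with person B's decoder
# produces the swap. Weights here are random stand-ins for trained ones.

rng = np.random.default_rng(0)
FACE_DIM, LATENT_DIM = 64 * 64, 128    # flattened 64x64 grayscale face (assumed sizes)

# Shared encoder: face -> latent representation
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01

# One decoder per identity: latent -> reconstructed face
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01

def encode(face):
    # Compress a face into a person-agnostic latent vector
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    # Render a latent vector as a face in the decoder's identity
    return np.tanh(W_dec @ latent)

face_a = rng.random(FACE_DIM)          # one frame of person A
latent = encode(face_a)                # shared, person-agnostic encoding
swapped = decode(latent, W_dec_b)      # the same expression, rendered as person B

print(latent.shape, swapped.shape)     # (128,) (4096,)
```

In a real system the encoder and both decoders are trained jointly on hours of footage of each person, which is why publicly available video of a target is so valuable to a fraudster.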
Social media has provided an opportune landscape for fraudsters to exploit people by using deepfake videos. Particularly if targeting younger people, fraudsters could potentially gain access to hours of recorded footage to build the perfect ‘fake’ model of their target.
So what does this look like in real life? It could be a parent or grandparent receiving a video message from a child or grandchild asking them to send ‘emergency’ money into their newly created bank account. Or a fraudster could imitate the voice of someone’s GP asking for personal information over the phone. It seems likely that more sophisticated methods like these will become more common in the not-too-distant future.
A shift in our relationship with social media may be required to address this threat. For instance, checking and updating your privacy settings on social networks could be a good place to start, if you haven’t already done so.
Imposters in the Metaverse
The advancement and proliferation of technology have changed – and will continue to change – the way we interact. We’re heading into a future where digital interactions can feel as vibrant and immersive as those in real life. This is the premise of the Metaverse, a shared virtual-reality space in which people will socialise, work and trade.
As the Metaverse advances and gains traction across society, fraudsters will be attracted to this new digital realm.
In fact, this already occurs today. Criminals present themselves online as genuine individuals seeking friendships or more, only to later exploit these relationships for financial reward. Such schemes are known as imposter scams.
A future Metaverse in which users can present themselves as whomever they wish, yet feel as authentic as real life, will provide a playground for imposters. While the full Metaverse may be some time away, the threat is present today. Interactions with strangers online – and even with people we know – must be managed carefully. Never give away financial details online, even to someone who appears to be a close family member: their account may have been compromised.
Remaining alert to these ever-evolving threats is extremely important. With just the two examples here we can see that it will become more difficult for consumers to personally detect fraudsters as technology improves. While already flourishing, APP crime is likely to become even more prevalent as deepfake technology grows and the Metaverse invites us to share our lives online. The only way to combat this is to be vigilant. We must all become more digitally responsible and remember that, online, not everything is as it seems.
Sean Lynskey is chief operating officer of digital bank, Chetwood Financial