The rise of AI is bringing with it a host of scary new scams. Here are four to be aware of.
Whether you’re an artificial intelligence (AI) enthusiast or sceptic, it’s impossible to deny that this technology has the potential to change our lives.
And in many cases, these changes will be for the better as AI can help people with hearing loss communicate, analyse climate change data and even write operas.
However, these remarkable leaps forward can’t disguise the fact that AI also comes with some pretty sinister applications in the wrong hands.
Key among these is fraud.
Trade body UK Finance recently issued a report revealing that Brits lose a staggering £3.2 million a day to scammers; the same report found that AI-related crime is a chief concern for many people.
In this article, we look at four of the most alarming AI-powered cons.
1. Faked insurance claims
According to research from This is Money, increasing numbers of UK drivers are employing AI-generated images – known as deepfakes – to make false insurance claims.
For example, a fraudster may manipulate an image of a car to appear as though it has been involved in a serious accident or subjected to vandalism.
Other criminals are using deepfake technology to create images of household objects that appear damaged in order to file a fraudulent home insurance claim.
Worryingly for insurance companies, it can be near impossible to tell these falsified images apart from the real thing.
Why we should care
Insurance fraud has a huge potential payoff for criminals.
According to the Association of British Insurers (ABI), the average case of fraud was worth £15,000 in 2023.
However, it isn’t just insurance companies that pay the price for these crimes. In fact, the ABI estimates that fraud adds as much as £50 per year to each of the average policyholder’s car and home insurance premiums.
In total, this means these con artists are costing the typical Brit a whopping £100 per year.
2. Online listings for non-existent products
With many of us loving to shop online, it’s hardly a surprise that scammers are turning this to their advantage.
As part of this con, cybercriminals typically set up a shop using a legitimate e-commerce platform and use AI to create images of non-existent goods that they can upload to these stores.
Often this scam relies on entering a description of a fake item into an AI-powered image-generating website.
Within seconds, the site will create an entirely convincing photo that the fraudster can add to their dodgy listing.
Why we should care
Once you’ve been tricked into sending money to a scammer, the chances of getting it back can be slim.
First off, getting a refund may rely on tracking down the perpetrator – often near impossible in cyberspace.
Moreover, if you attempt to claim compensation from your bank, it might argue that you are liable for the loss as you ought to have checked the seller’s legitimacy before transferring the funds.
3. Bank ID verification fraud
With some banks requiring new customers to provide a video of themselves saying a particular phrase when they open an account, scammers can employ AI-generated audio or video to bypass these security checks.
In one case that went viral last year, a journalist ‘broke into’ a bank account (admittedly his own) using an AI-generated version of his own voice.
Bear in mind, this incident took place more than a year ago – which is an extremely long time in the world of AI.
If this was possible in 2023, who can say what sophisticated fraudsters will be able to achieve in five or even 10 years’ time?
Why we should care
With research from fraud prevention body CIFAS revealing that identity theft makes up almost 70% of cases filed to the National Fraud Database, this type of crime is clearly a fruitful area for scammers.
For many victims, the financial and emotional costs can be devastating and it may take years to repair the damage caused.
You can read more about what to do if your identity is compromised in this handy guide.
4. UK business loses almost £20 million
At the start of the year, engineering firm Arup lost almost £20 million as part of an AI-based scam.
The crime relied on using a faked video of a senior executive to trick a Hong Kong-based employee into sending the funds.
So far, no arrests have been made.
Although this incident is an extreme example, attacks of this nature are far from unusual.
According to a report from software engineering company ISMS.online, almost a third of UK information security professionals have experienced a deepfake attack in the past 12 months.
Why we should care
Although it may be tempting to believe that it is a business’s own problem if it gets scammed, cases such as these are evidence of a wider potential threat.
If scammers can dupe a business into handing over its own cash, it raises concerns over the safety of any money and personal data we entrust to the organisations we deal with every day.
AI as a fraud detection tool
Although this technology can clearly be a valuable weapon for scammers, there is a real catch-22 when it comes to AI and fraud.
Ironically, the same technology behind many new online attacks is also at the cutting edge of fraud detection.
By using AI and machine learning (in which computer systems learn patterns from data rather than following explicit human instructions), banks and insurance companies can analyse vast amounts of data to identify suspicious patterns that could indicate criminal activity.
For instance, banks can use this technology to detect uncharacteristic transactions in a customer’s spending, which could suggest that their account has been compromised.
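Real banking systems use far more sophisticated models, but the core idea of flagging a transaction that deviates sharply from a customer's usual spending can be sketched in a few lines of Python. Everything here is illustrative: the spending history, the function name and the three-standard-deviations threshold are all made up for the example.

```python
# Toy illustration of transaction anomaly detection. The data and the
# z-score threshold are hypothetical, not a real bank's method.
from statistics import mean, stdev

def flag_unusual(transactions, new_amount, z_threshold=3.0):
    """Flag a transaction whose amount deviates sharply from history."""
    mu = mean(transactions)
    sigma = stdev(transactions)
    if sigma == 0:
        # No variation in history: anything different stands out.
        return new_amount != mu
    z = abs(new_amount - mu) / sigma  # distance from the norm, in std devs
    return z > z_threshold

history = [42.50, 18.99, 55.00, 23.40, 61.25, 30.00]  # typical spending
print(flag_unusual(history, 35.00))   # in line with past habits: False
print(flag_unusual(history, 950.00))  # far outside the usual range: True
```

In practice the "pattern" a bank looks at is much richer than the amount alone (merchant, location, time of day, device), but the principle is the same: score how far a new event sits from the customer's established behaviour and flag the outliers for review.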
Likewise, insurers can use AI to review claims history to identify patterns of potentially criminal behaviour.
If there is enough data, this could help expose fraud on a large scale, possibly even related to organised crime.
How to get help
While AI is adding to the sophistication of scams, there are organisations that can help if you fall victim to fraud.
These include:
Action Fraud
Citizens Advice
MoneyHelper
Financial Conduct Authority