A cyber security expert has urged Australians to be vigilant after recent research revealed consumers and businesses are being targeted by deepfake scams more often as AI technology continues to surge.
Deepfakes are images, videos or audio recordings altered by AI to make it appear someone said or did something they did not actually say or do.
The technology can be used by fraudsters to manipulate consumers and businesses in an attempt to defraud them of money.
Statistics revealed by Mastercard Australasia showed Aussies are seemingly more at risk than ever of finding themselves on the receiving end of scams.
As many as 36 per cent of Australians said they were targeted by deepfake scams in the past 12 months and were collectively swindled out of tens of millions of dollars, while 20 per cent of Australian businesses said they received deepfake threats in the last year.
Mallika Sathi, Vice President, Security Solutions at Mastercard Australasia, said many of the victims of the scams were not aware they had been targeted, signalling it was just the “tip of the iceberg”.
Ms Sathi said while deepfake technology was “advancing very quickly”, consumers and businesses should turn their concern into understanding.
“Deepfake technology is advancing very quickly, so it’s absolutely something Australians should be aware of, however instead of worrying we should focus on understanding how this technology works and how to protect ourselves against it,” she told SkyNews.com.au.
“As deepfakes can be utilised in many different types of scams, including video, images and audio, we encourage Australians to remain informed and vigilant, as the threat increases with the development of AI technology.”
Well-known celebrities are frequent targets of deepfakes because of the sheer volume of footage of them speaking and acting publicly, and having a familiar face on a scam makes it “more compelling” for consumers who come across it.
Research found older generations are among those who feel less confident about being able to detect a deepfake scam if they came across one.
One in three Australians said they were not confident in detecting them, meaning they were more vulnerable to falling victim to a scam.
Ms Sathi recommended families sit down and agree on a codeword system, so that if a call or message appears to come from a family member, they can confirm it is really them by asking for the codeword.
The rise of AI technology has also led to a majority of Australians (60 per cent) losing trust in social media, while 47 per cent are losing faith in phone calls and 46 per cent in messaging.
With Australian businesses also facing higher risks of attacks, Ms Sathi said they should begin building “digital literacy”.
With less than half of the companies surveyed having conducted cyber security training with their employees, staff are advised to become educated on scam techniques such as “phishing emails, malicious links in texts, bogus websites, scam calls, manipulated invoices and deepfake social posts”.
“Businesses should encourage their employees to verify the source and check the identity of the sender before proceeding. If in doubt, trust your instincts,” Ms Sathi said.
“With hybrid working, ensure employees use a VPN and complex passwords or passphrases. For example, using passwords with three to four words together, versus random letters and numbers, so it is easier to remember and harder to guess.”
The Albanese government has been working to protect Australians from scams, unveiling the Scams Prevention Framework for public consultation in September.
Last week, the government announced it would provide $14.7 million over two years to the Australian Financial Complaints Authority (AFCA), giving scam victims a clear pathway to seek compensation.
Under the government’s new actions, the bank and social media platform where a scam occurs could be liable if adequate protections have not been put in place.
“Our scams crackdown will cut off the avenues scammers use to target Australians by setting a high bar for what businesses must do to prevent them,” Assistant Treasurer and Minister for Financial Services Stephen Jones said in a statement.
“Scam victims will have a clear pathway for redress.
“We want victims of scams to know the Government has their backs, and we want businesses to understand that they have a responsibility to protect Australians from these often devastating scammers.”
Ms Sathi praised the government’s work in providing protections against scams but said Australians needed to do their part to prevent scams from happening to them.
“Everyone has a role to play in this effort, especially individuals who should stay informed about deepfakes and practice good cybersecurity habits, such as being cautious about sharing personal information online,” she said.
“Businesses can contribute to this by providing regular cybersecurity training for their employees, educating about common scam techniques and encouraging a culture of vigilance.
“Ultimately, it’s about being informed, alert and supportive of each other’s efforts to create a safer digital environment.”