PHNOM PENH: As technology continues to evolve at a rapid pace, the rise of artificial intelligence (AI) and its various applications has drawn both excitement and concern.
One such technology, “deepfakes”, is now creating waves globally, particularly in countries like South Korea, where it has become a tool for digital sex abuse.
Though Cambodia has yet to face the same surge in deepfake-related crimes, the nation must stay vigilant in safeguarding its people from the growing dangers posed by this technology.
Deepfakes, a portmanteau of “deep learning” and “fake”, are AI-manipulated videos, images or audio that make it appear as if someone is saying or doing something they never did.
While deepfakes have legitimate uses in entertainment, such as in films or virtual presentations, they have also emerged as tools for misinformation, fraud and abuse, particularly against women and vulnerable individuals.
In South Korea, deepfake technology has been exploited to create and distribute fake sexual images and videos, often targeting women and teenage girls.
The culprits use platforms like Telegram to share the illicit content, causing severe emotional and psychological harm to the victims.
“In recent weeks, a large number of Telegram chatrooms – many of them run by teenagers – were found to have been creating sexually explicit ‘deepfakes’ using doctored photographs of young women,” according to a BBC report last week.
This troubling trend has led to public outrage and pressure on the government to crack down on perpetrators, even as funding for prevention programmes is being cut.
Although Cambodia has not yet experienced widespread misuse of deepfakes, experts urge the country to take pre-emptive measures.
Chin Sovann, director of the Technology Department at the Ministry of Industry, Science, Technology and Innovation, highlighted the importance of awareness and vigilance.
“Deepfakes … use Generative Adversarial Networks (GANs) to manipulate images or videos, transforming them into realistic depictions of people or objects,” Sovann told The Post.
He noted that while the technology is often applied in the entertainment industry, it has a darker side.
“Deepfake technology has also been misused, particularly in the creation of inappropriate content, such as in the adult film industry, where it is sometimes referred to as deepnude,” he said.
Sovann emphasised the need for Cambodian citizens to be cautious of potential scams or fraudulent activities involving deepfakes.
“Avoid using questionable sources or sharing personal information that could lead to malicious activities, such as identity theft or data breaches carried out by anonymous attackers,” he cautioned.
Lim Sangva, founder and CEO of ArrowDot, a local automation solutions company, acknowledged the benefits of deepfake technology for certain applications, but warned about its potential for harm.
“It’s useful when we want to present something without having to perform ourselves,” he said. “We can now use more advanced technological tools to create deepfake images or videos.”
However, Sangva also pointed to the dangers of deepfakes being used for malicious purposes, such as creating fake videos of celebrities or even everyday people for profit.
“As of now, there is no crime involving the use of deepfake technology in Cambodia, but we need to be proactive,” he told The Post.
“Cambodia should implement both technology and legal frameworks to regulate the use of such technologies.”
Sangva also raised concerns about the use of deepfake voice technology, which could be used to deceive relatives or friends into lending money by mimicking an individual’s voice.
“Just as software can create deepfakes, there must be software to detect and identify them,” he added.
In South Korea, those found guilty of creating sexually explicit deepfakes can be jailed for up to five years and fined up to 50 million won (about US$37,270).
The Post was unable to reach Sok Nithya, director of the Ministry of Interior’s Anti-Cybercrime Department, for comment. – The Phnom Penh Post/ANN