South Korea confronts unseen scars of AI sex crimes: ‘the law didn’t protect me’
“I felt like I was alone,” Ruma, 31, told This Week in Asia. “I had to gather the evidence myself and even identify the culprit … it felt like I was doing the police’s job.”
The suspect, a college alumnus surnamed Park, had targeted 61 women, distributing 1,852 AI-generated explicit images through the Telegram messaging app. Park and his accomplice, known as Kang, referred to themselves as “photo composition experts”.
In September, Park and Kang were sentenced to 10 years and four years in prison, respectively. They later filed an appeal, and on Friday an appellate court reduced their sentences to nine years for Park and three years and six months for Kang, taking into account that they had reached a settlement with some of the victims.
Ruma’s case is emblematic of a surge in digital sex crimes in South Korea. More than 18,000 such cases were documented in 2024 – a 12.7 per cent increase from the previous year – according to the country’s gender equality and family ministry.
Especially alarming is the explosion of deepfake technology used in these crimes, with such cases rising by 227 per cent last year alone. Deepfakes use AI to mimic a person's face, voice or actions, with victims often inserted into fabricated pornographic content.