In South Korea, Misogyny Has a New Weapon: Deepfake Sex Videos
In 2020, as the South Korean authorities were pursuing a blackmail ring that forced young women to make sexually explicit videos for paying viewers, they found something else floating through the dark recesses of social media: pornographic images with other people’s faces crudely attached.
They didn’t know what to do with these early attempts at deepfake pornography. In the end, the National Assembly enacted a vaguely worded law against making and distributing it. But that did not prevent an AI-driven crime wave that has now taken the country’s misogynistic online culture to new depths.
In the past two weeks, South Koreans have been shocked to find that a rising number of young men and teenage boys had taken hundreds of social media images of classmates, teachers and military colleagues — almost all young women and girls, including minors — and used them to create sexually exploitative images and video clips with deepfake apps.
They have spread the material through chat rooms on the encrypted messaging service Telegram, some with as many as 220,000 members. The deepfakes usually combine a victim’s face with a body in a sexually explicit pose, taken from pornography. The technology is so sophisticated that it is often hard for ordinary people to tell they are fake, investigators say. As the country scrambles to address the threat, experts have noted that in South Korea, enthusiasm for new technologies can sometimes outpace concerns about their ethical implications.
But to many women, these deepfakes are just the latest online expression of a deep-rooted misogyny in their country — a culture that has now produced young men who consider it fun to share sexually humiliating images of women.