Comment

The deepfake Taylor Swift images prove sexual violence is still the weapon of choice against women

The US pop star is the subject of horrific, violating AI-generated images that have been circulated on social media. Roisin O’Connor writes that they shine a grim light on the lengths to which misogynists will go to degrade and humiliate women

Friday 26 January 2024 15:36 GMT

In September last year, Elon Musk was widely mocked after sending a public message to Taylor Swift.

“I recommend posting some music or concert videos directly on the X platform,” he said in an X post to the US pop star.

The grovelling display from the SpaceX founder, who took over Twitter in October 2022, was embarrassing enough at the time (Swift was one of a number of celebrities he had attempted to court in a bid to get them posting on his platform). But if the “Shake It Off” singer wasn’t tempted back then, she certainly won’t be convinced after the events of this week.

On Thursday, explicit deepfake images of the 34-year-old were posted to X and circulated across social media, sparking dismay and outrage among her fans and many more besides. Deepfakes are AI-generated images or videos that fabricate a real person’s likeness; Mashable reports that a study last year found that 98 per cent of all deepfake videos online are pornographic. Ninety-nine per cent of deepfake targets are women.

It’s been a horrific week for Swift. Just a few days before the pictures were posted online, a man was arrested for the third time in seven days after being caught outside her home (he has been charged with stalking and harassment). Being a woman in the public sphere is terrifying enough – there have been at least a dozen incidents in which men have attempted to break into Swift’s home. Some were successful.

Taylor Swift "Furious" Over AI Generated Sexually Explicit Images

Swift has spoken about how she carries army-grade bandages for knife or gunshot wounds. “Websites and tabloids have taken it upon themselves to post every home address I’ve ever had online,” she said. “You get enough stalkers trying to break into your house and you [start] prepping for bad things.”

The fake pictures of Swift, which I saw involuntarily – and felt sickened by – after clicking on her trending name last night, show her in sexualised positions while being groped and assaulted by hordes of Kansas City Chiefs fans, a reference to her relationship with the team’s tight end, Travis Kelce. According to reports, one image was viewed more than 47 million times before X began removing the posts, by which point they had apparently also spread to Facebook, Instagram and Reddit.

Taylor Swift performing during her Eras tour (AP)

Swift’s fans were spurred into action, desperately trying to quell the abuse of their favourite artist. They shared images and footage of her on stage, at awards shows, on the red carpet – anything but the pictures she so obviously had not consented to.

Ellie Wilson, a prominent sexual assault survivor and campaigner, was among the thousands to speak out against the pictures, but also examined why so many men were visibly engaging with them on a public platform.

“I think part of the reason so many men are loving these fake images of Taylor Swift being sexually assaulted is because it’s the humiliation and degradation of a strong and powerful woman that does it for them,” she wrote.


“They want to see her ‘put in her place’.”

Indeed, even in 2024, there are few things a misogynist society finds as infuriating as a woman who doesn’t answer to it. Its means of punishing her, then, is sexual humiliation and degradation – to send the message that, however rich, powerful or famous a woman might be, there are still ways to bring her down. Our bodies are not our own because, thanks to AI, they can be manipulated, changed, warped.

‘They want to see her put in her place’ (AP)

There were mutterings, when Taylor Swift was crowned Time’s Person of the Year, that she had once again become too popular. Yes, it’s impossible to deny that it truly felt as though she was (and still is) everywhere, whether releasing chart-topping albums, performing sold-out shows around the US, or breaking records at the global box office with her concert movie.

Yet simply doing her job, and excelling at it in a way few other artists in history have, is apparently enough to warrant a backlash. The attitude among many seems to be that the obsessive media coverage of her career and her relationship with Kelce is somehow her responsibility, despite the fact that she has given only one sit-down interview in four years.

There have been plenty of warnings about the dangers posed by deepfake images – Swift is the latest victim but she won’t be the last. Other targets have included fellow pop singers, journalists, TikTok creators, influencers, and young women and girls. Right now, an inquest is taking place into the apparent suicide of British schoolgirl Mia Janin, 14, whose male classmates allegedly shared fake nudes of female pupils on Snapchat. Similar behaviour was exposed at a school in New Jersey last November.

Sharing deepfake porn has been illegal in England and Wales since June 2023, part of a government bid to crack down on “abusers, predators and bitter ex-partners who share intimate images online without consent of those depicted”. In the US, many states have still not updated their anti-revenge-porn laws to cover the use of technology to create and share fake images. Clearly, companies such as X are not yet equipped to act with enough speed or efficiency when something like this occurs.

However protected Swift might seem, however luxurious or privileged her life appears, none of that should have anything to do with how we react to those pictures. They violate her most basic rights to privacy, respect, and autonomy over her own image – and no amount of privilege will ever justify that.
