
IN FOCUS

A young girl’s avatar was assaulted in the metaverse – what should be considered a crime in VR worlds?

The metaverse developers’ joyful immersive universe has taken a dark turn, and the harm caused is real – Ellie Muir explores how virtual worlds are rife with rape culture and misogyny, and asks how we are going to make them safe to experience

Tuesday 30 January 2024 06:12 GMT
‘Even though something may happen online, they’re real emotions ... the trauma can be experienced as real,’ says psychologist Dr Daria Kuss (Getty)

Remember when Meta launched that futuristic campaign in October 2021, when Mark Zuckerberg, in avatar form, showed us around his bright and lavish virtual mansion? At the time, the company had just rebranded from Facebook, and the social media supremo delivered an hour-long keynote explaining why the metaverse promised us a bright and exciting future. “We’re going to be able to express ourselves in new joyful and completely immersive ways,” Zuckerberg said, walking past a wardrobe of virtual costumes designed for his avatar. Then, rather bizarrely, Nick Clegg, the company’s vice-president of global affairs and communications, popped up on screen to discuss privacy settings. “The next version of the internet can deliver that feeling of deep connection and presence – that’s what the metaverse is all about,” added Zuckerberg.

Whatever predictions Zuckerberg was making about the future of the metaverse back in 2021, it still seemed far removed from the daily lives of us mere mortals – or so we thought.

Well, as of late 2023, research shows the metaverse is in full swing. One study shows that 66 per cent of UK children have engaged with virtual reality (VR), while 25 per cent use it every week. In June, Meta widened its pool of consumers by lowering its age limit for metaverse access from 13 to 10 years old. Meanwhile, some VR app developers have already reported a sixfold increase in sales since Christmas Day in 2023. But as the number of metaverse users increases, the cheerful language used by tech companies contrasts starkly with a darker reality.

On 2 January, it was reported that British police were investigating a virtual sexual assault on a girl’s avatar, the first time that UK authorities have looked into a case of this kind. We don’t know the details of what happened, only that the girl was under the age of 16 at the time, and that she was playing an immersive game using her VR headset when her avatar was raped by several others. Though women have been sounding alarm bells about the rape culture of the metaverse and the misogyny often exhibited by anonymous actors within it, this case is unprecedented in attempting to intervene after sexual abuse has occurred in a three-dimensional, embodied world.

The metaverse is a collective name given to these virtual embodied worlds, in which users are represented by their avatars. It can be accessed by strapping on a VR headset, allowing you, in avatar form, to move, communicate and interact with other humans in virtual form. In the metaverse, you can go shopping, fishing, attend team meetings or have a virtual beer at a stadium concert. There are, of course, many new and exciting benefits that come with this brave new world, but experts are increasingly vocal about how unregulated these VR worlds are. As yet, there is little accountability for people who break the – often very loosely enforced – anti-abuse policies outlined by the platforms.

Nina Jane Patel, a London-based psychologist and metaverse researcher, was one of the first adults to publicly share her experience of having her avatar sexually assaulted. Using her Oculus Quest 2 VR headset in the comfort of her own home, she had accessed Meta’s social VR app Horizon Worlds when, she says, her avatar was “gang-raped” within a minute of her arriving in the room. “I don’t even think it had been one minute… I had four male avatars verbally harassing me, and then proceeding to assault my avatar,” Patel tells me. One man yelled: “Don’t pretend you didn’t love it,” while another said, “Go rub yourself off,” even though Patel repeatedly told them to stop. “I’d move to another area of the room but that didn’t stop them,” she says. “I was disrespected and assaulted. They were relentless, and just continued becoming more and more aggressive.”

While virtual rape doesn’t involve penetration, since avatars don’t have genitals, users can simulate sex and oral sex by positioning their avatars provocatively. That, coupled with using their own voices to make sexually suggestive and disturbing noises, means that avatars can still approach others in a non-consensual and invasive manner. Given the advancement of headset technology, plus the use of sound and haptic sensations, women who have reported being sexually assaulted in the metaverse have described how real it can feel, despite there being no physical bodily touch.

Dozens of women have come forward with similar stories. A female researcher from the campaigning non-profit organisation SumOfUs said her avatar was assaulted while in Horizon Worlds. “About an hour into using the platform, a researcher was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see,” states a report by the organisation. “Another user in the room watched and passed around a vodka bottle.”

‘Even though something may happen online, they’re not virtual emotions, but real emotions and therefore the trauma can also be experienced as being real,’ says psychologist Dr Daria Kuss (Getty)

Several studies report that metaverse spaces lack regulation when it comes to monitoring sexual assault and safeguarding children. A recent report by the NSPCC found that technology companies are “failing to prioritise child safety on platforms” where child sexual abuse is happening. Richard Collard, associate head of child safety online policy at the NSPCC, told The Independent that online sexual abuse “has a devastating impact on children” and that in an immersive world, where senses are intensified, “harm can be experienced in very similar ways to the real world”.


Despite numerous reports documenting how rape culture often underpins the environment in many VR spaces, questions are still raised about what can legitimately be classified as “rape” or “abuse” in the metaverse. And there seems to be an obvious question here: why, if these players were uncomfortable in these VR spaces, did they not remove their headsets and turn the game off altogether?

Well, my answer would be this: if a woman or girl cannot sit down in what is probably the safest place she can be – her home – to play a game she enjoys, what does that say about the way misogyny and abuse exist on these platforms?

Dr Verity McIntosh is a researcher and senior lecturer in virtual and extended realities at the University of the West of England, and has co-authored seminal reports on the subject, including one for the NSPCC entitled Child Safeguarding and Immersive Technologies, and the Institution of Engineering and Technology’s (IET) report Safeguarding the Metaverse. McIntosh tells me that the psychological impact of an assault happening in VR should be taken seriously, since the experience can trigger feelings similar to those of an in-real-life (IRL), non-virtual incident. She explains this through the “process of embodiment” – the way our online representation connects with our actual self, which can be intensified if you use an avatar that looks like you, or if you use your own voice to speak to individuals in those VR worlds.

The more sophisticated the technology, the hazier the boundaries become between what feels real and what is not. “It’s a live interaction, and it feels personal. You speak with your natural voice, your [virtual] movement follows your physical movement and so with those kinds of psychological cues [you think] this is a real person, and we very quickly adopt the kinds of behaviours that we would in a normal social situation,” explains McIntosh. “And just like in any sort of physical social situation when somebody violates codes about personal space… while it is experienced differently [in the metaverse] it’s still incredibly profound and can be psychologically affecting in a way that we’re not used to culturally.”

‘There’s lots of children mixing with adults in these spaces and there’s not really any enforcement of rules,’ says Callum Hood, head of research at the Centre for Countering Digital Hate (Getty)

Dr Daria Kuss, associate professor in psychology at Nottingham Trent University, tells me that if an individual has already experienced sexual harassment or assault IRL, a virtual assault can remind them of that physical experience and potentially “re-traumatise” them. While Kuss says there is no evidence to suggest that virtual assault can be seen as equal to the experience of IRL sexual assault, both instances can evoke similar emotions, and a person’s feelings after they are assaulted in virtual spaces should be validated. “Even though something may happen online, they’re not virtual emotions, but real emotions and therefore the trauma can also be experienced as being real.”

McIntosh adds that the rebuttal of “Why didn’t you just turn off your headset?” often shifts the blame from the aggressor onto the victim. “There’s this idea that if you don’t like it... then don’t play with the big boys,” she says. “The fact that media reports and social media forums tend to direct blame towards the victims for not having their privacy settings [turned on], or somehow not anticipating the traumatic episode by taking their headsets off in time [moves] the spotlight away from the offending player and makes the victim seem kind of naive or bringing it ‘on themselves’.”

Meta-physical

I’m watching a recording of a male avatar making sexually suggestive noises as he follows a female avatar around a room in VRChat (the most-reviewed social VR app on Meta’s app store). He’s harassing her even though she’s asking him to stop. This time, it’s not happening to Patel; it’s happening to a female avatar in footage being shown to me by Callum Hood, head of research at the Centre for Countering Digital Hate (CCDH). In 2021, from his London office, Hood observed VRChat for 11 and a half hours and recorded more than 100 violations of Meta’s guidelines, finding, on average, one example of abuse every seven minutes.

He’s about to hit play. “Just to warn you, it’s very weird,” he says, before showing a video in which one user projects hentai porn (sexually explicit Japanese manga-style animation) across the room for all the users to see, without their consent. “I’d occasionally log into a space and there’d be a woman present, and they would effectively get mobbed by male users,” says Hood, citing gender-based harassment as a common occurrence in VRChat.


Hood reported each violation to Meta, but to no avail – they all went unanswered. “Meta had its opportunity to act on these accounts that we identified and as far as we know, it has not done so,” he says. And there are flaws within Meta’s reporting procedure that add even more hurdles to the arduous complaints process. The CCDH discovered that a report won’t be accepted if Meta can’t recognise the usernames involved. So, if an aggressor changes their username, they can remain unscathed and continue to use Meta platforms freely. “It’s really important that users can navigate these chaotic environments in the heat of the moment if they need to report something,” adds Hood.

The CCDH published another report in March 2023, this time looking at Meta’s failure to protect minors in Horizon Worlds. Researchers logged into Horizon Worlds 100 times, found minors present in 66 of those instances, and identified 19 incidents of abuse directed at minors by adults, including sexually explicit insults and racial, misogynistic and homophobic harassment.

“There’s lots of children mixing with adults in these spaces and there’s not really any enforcement of rules,” says Hood. Meta’s code of conduct for virtual experiences tells users: “do not promote anything that’s illegal, abusive or could lead to physical harm, such as sexualising, exploiting or abusing minors.” However, Hood says that if Meta were to make its regulations stricter and ban every user who violates its behaviour policies, this could present a “conflict of interest”, since Meta’s profit comes from the sale of apps and in-app features, rather than the sale of headsets. “If Meta removes a user from their platform, that’s someone that they can’t sell products to any more,” says Hood.

Women make up only 34 per cent of the labour force at US tech companies like Amazon, Apple, Facebook, Google and Microsoft, according to statistics from recruitment company Zippia (Getty)

When contacted by The Independent, a spokesperson for Meta outlined a range of “teen safety features” available. “Teen profiles are automatically set to private, so they’re able to approve or decline anyone who requests to follow them,” they said. Another feature, voice mode, transforms the voices of people the teen doesn’t know, to give them “more control over who can communicate with them”. The spokesperson also outlined a range of tips for parental safeguarding, and added that Meta works with “over 500 women’s safety NGOs around the world through regional roundtables and during the UN Commission on the Status of Women to get ongoing feedback on our safety tools” to make all of its spaces “welcoming and safe for everyone”.

Real solutions to a virtual problem

When it comes to the case of the assault on the underage girl’s avatar, it’s still not clear what laws will be drawn upon, and whether the case will involve the Online Safety Act – the recently enacted legislation designed to protect children and adults online.

Already, though, there’s scepticism among experts about how far the new law addresses what has been widely dubbed the metaverse’s “groping problem”. When the Online Safety Bill received royal assent to become law in October 2023, McIntosh and several colleagues from the Institution of Engineering and Technology issued an open letter to Ofcom, in which they asked for an urgent review of how VR spaces are governed, “given the seriousness of the offences being committed”.

Still, prosecuting rape is never easy, and it is uncertain how this will translate into VR worlds. It’s widely known that women often face huge barriers when trying to report a rape: more than 99 per cent of rapes reported to police in England and Wales do not end in conviction – imagine the difficulty of bringing a case over an assault that happened virtually. “There is not a common understanding within that profession and there’s not really clear guidance about how to prosecute someone who is found guilty of a crime in this context,” says McIntosh. “There are also unresolved issues that the government could take a position on, so we’re having to find things out as we go along.”

Could there ever be a solution to the metaverse’s groping problem? Martina Welkhoff is a founding partner of WXR, a Seattle-based fund that invests in women-led start-ups in VR and AR (augmented reality). She tells me that the scarcity of women working on the development of immersive apps could explain why women often feel pushed out of these environments.

Women make up only 34 per cent of the labour force at US tech companies like Amazon, Apple, Facebook, Google and Microsoft, according to statistics from recruitment company Zippia. “When people in the room aren’t able to recognise those vulnerabilities and the way that women might have a very different user experience than men, it causes harm or poses a potential danger, particularly in the social setting,” says Welkhoff, who is well aware of the benefits of having more women pioneering VR apps, since her job is to invest in them. “We’ve noticed that if companies are founded by a diverse group of folks, whether that’s women and men together or people of colour and under-represented voices more broadly, they tend to build out more diverse teams, and they tend to outperform the competition,” says Welkhoff. “I just think it’s a huge strategic mistake not to proactively make sure women are a big part of these early-stage teams.”

The next generation of children will spend an estimated 10 years in virtual reality over the course of their lifetimes, which works out at close to three hours a day, according to the IET. How lawmakers adapt to this changing form of interaction will become clear soon enough. But in the meantime, it remains to be seen whether developers can make their platforms live up to their cheerful marketing.
