<a href="https://www.thenationalnews.com/arts-culture/comment/2021/12/30/from-apple-to-meta-what-the-tech-giants-have-planned-for-2022/" target="_blank">Meta</a>, the company formerly known as Facebook, opened the doors in December to Horizon Worlds, a new virtual reality platform. Part of what it calls the “metaverse”, it’s billed as a place for friends – represented as avatars – to meet, talk and interact while wearing VR headsets. But it didn’t take long for this parallel society to start mirroring the problems that exist in the real world. Within weeks, a number of women had come forward to complain of sexual harassment and assault in Horizon Worlds and its sister platform, Horizon Venues.

Meta’s community standards do not permit such behaviour, but the difficulty of enforcing rules, coupled with the anonymity and nascent social structures of VR, has posed an important question: can such spaces ever be safe for women?

When words such as assault and harassment are used to describe behaviour in virtual environments, there’s an assumption that such incidents must be of lower magnitude than their real-world counterparts. However, the women’s accounts make for harrowing reading. They describe being “groped” and even “raped” by male avatars while others look on without intervening. One spoke of having suffered from anxiety ever since the incident, while describing the “intensity” of the experience.

The alternate reality constructed by Meta and other companies working in the VR and XR (extended reality) space is, by definition, supposed to feel authentic. The corollary, however, is that personal abuse feels authentic, too. “The neuroscience behind the way XR environments affect our cognition shows that it actually is real,” says Brittan Heller, a human rights advocate and VR expert. “The way that you and I experience an event in VR is interpreted by our brains in the same way as if you and I were sitting next to each other on the couch.
So when companies regard abusive situations as being akin to reading obnoxious content on text-based social media, it’s just not the same. It’s like somebody sitting next to you in your living room and assaulting you. That’s how your brain interprets it.”

Meta’s initial response to these complaints came in for much criticism. A company representative said that one complainant had failed to use the tools available to her to block, mute and report offenders, or indeed use a “Safe Zone” feature to stop unwanted advances. “I think there’s a tendency with platforms to blame the targets of abuse,” says Heller. “You’ve seen that from the early days of social media, where targets – especially women – were told to turn off their computer screen, and it’s the same with VR.”

Meta belatedly introduced a “personal boundary” option, turned on by default, which imposes a minimum distance of about one metre around each avatar. This, the company says, “will help to set behavioural norms”.

Critics have expressed bafflement that Meta seems to have failed to foresee the inevitable. More than five years ago, a woman named Jordan Belamire wrote of her experience of assault while playing a VR archery game, QuiVr, on the HTC Vive system. “I hadn’t lasted three minutes in multiplayer without getting virtually groped,” she wrote. “What’s worse is that it felt real, violating … As VR becomes increasingly real, how do we decide what crosses the line from an annoyance to an actual assault?”

Her words were prophetic, and yet new platforms are being developed and launched without proper consideration for the welfare of potential targets of abuse, be it sexual or racial. This can, in part, be attributed to the acknowledged lack of diversity within big tech, where users of new technologies are assumed to be white and male by default. But according to Heller, there’s the additional problem of the hype surrounding VR. “We’re in the equivalent of the new space race,” she says.
“Large companies are competing for your next personal communication device. If you understand the challenges of [VR] as a product of the race to replace your cellphone, it makes much more sense. You can see why a company would release something that didn’t have well thought-through safety features, or a user education program in place.”

It’s still not known whether, in the long term, society will embrace VR and the metaverse. It’s certainly off to a slow start; according to its most recent earnings report, Meta’s Reality Labs division lost more than $10 billion over the past calendar year. Its expansion won’t be helped if half the world’s population is fearful of the consequences of using it.

Moderating abusive behaviour in text-based environments already presents a challenge; in VR it might be close to impossible. “To process visual information about an avatar, or how close one is to another, [is] going to take up so much computer power,” said <i>Bloomberg</i> columnist Parmy Olson in an interview with the BBC. “I don’t know what technology can do that.” Heller concurs. “Right now there is no technological solution for these problems,” she says. “It’s a social problem.”

Some working in the metaverse have stated their desire to solve it, including Philip Rosedale, founder of Second Life, an early incarnation of the metaverse. “It’s possible to build a version of the metaverse that doesn’t harm people but actually can help with the problems we have now,” he told <i>Fast Company</i>. “Virtual worlds don’t need to be dystopias.”

Heller also remains hopeful. “I love VR and XR,” she says. “For me, the promise and potential are mind-blowing. It’s akin to splitting the atom. However, I think it’s a failure of imagination if we just import the harms and structures of existing social media into a new format.”