Community Safety Problems Don't Have to Stay That Way

Every virtual world struggles with safety management. There is inevitably a toxicity problem, a harassment problem, and a minimum-age problem. This has been true of every platform I've seen, and it will challenge whatever new digital space arises next.

But when the news of VRChat's latest kidnapping case appeared, I was nervous. Aaron Zeman, also known as Hunter Fox or Tadashi Kura Kojima, was arrested last month by the Grand Island Police Department in Nebraska for allegedly abducting a 13-year-old boy. Depending on which report of the arrest you read, Roblox, Twitter, or VRChat is named as the place where the grooming is said to have happened. Zeman goes to trial on January 30th; more testimony may come out then to reveal the full story.

Virtual reality has unfortunately seen kidnapping cases before. With the most recent, however, there's an additional problem: the minor in question is the same gender as the alleged kidnapper. On top of that, Zeman is allegedly a self-described furry. The Daily Mail has already published an article seizing on this information. What Zeman is charged with doing is reprehensible, but the article's tone is all too keen to lean on the idea of a gay, furry kidnapper. And the source for the incriminating tweets? Screencapped, according to the Daily Mail, with the convenient help of far-right speaker Andy Ngo.

Along the same lines, a YouTube documentary released on January 7th by Visual Venture purports to detail several safety concerns with VRChat. The documentary instead ends up hurting the community it aims to protect, swiping footage from several VRChat content creators. It's also quite bold in pairing imagery of the platform's music venues with the implication that they are all deliberate breeding grounds for predatory behavior.

An experienced eye knows this is agitative media packaged with a commercial between cuts. What the Daily Mail and Visual Venture are doing is essentially the same: take something that is true and needs addressing, then stretch it a little and add some shock and outrage for clickbait. But this time it sacrifices VRChat's community, its content, and the love and hard work behind it, throwing them under the bus for something the platform's administration is failing to enforce.

It's also cheapening the subject. VRChat very much does have a minimum-age and predatory-user problem, and a very bad harassment problem. A few months ago, I stopped by a virtual chess hall that's always populated. Most users there were lively and getting up to prank-level mischief if they weren't locked in a game, but one of them was a child loudly proclaiming he was 8 years old. I told him, aghast, that he shouldn't be in VRChat at all. He looked back at me and said he didn't care.

I filed a report and wondered if it would go through. I didn't have any proof; I just knew what I saw and what I had heard. Most users don't record their play sessions because it's too cumbersome, and you shouldn't need to in order to play chess and have fun. I've also worried about abuse reports I've filed over harassment. How much proof is enough proof, and at what point does a safety team decide what to do when the harassment is more covert?

Right now, I have a feeling VRChat is struggling to keep up with a growing userbase. It might not have enough resources for increasing demands, and its original safety policies may have worked best when the userbase was smaller. Some of those policies probably need an overhaul to stay effective, and I acknowledge that's going to take a lot of meetings and brainstorming.

I have ideas, but they aren't perfect. One is a dedicated intervention team for cases where users extend harassment beyond VRChat itself. That kind of intervention would probably need to include an informational packet so users understand what to do when someone is harassing them across multiple programs. Even the existence of that packet can bring peace of mind to harassment victims and empower them with knowledge of what to do if they need extended help.

Another idea is to be firmer about what kinds of evidence abuse reports accept. Are tweets permissible as evidence? Discord logs? Is a video of either permissible to validate its existence? What can users seeking help do to make the process smoother?

And then there's the minimum age: it needs to rise to at least 16, backed by an age verification system. That might lose the platform some investor money, but that money would be lost sooner or later with enough crime reports. Crime can still happen with older users, but I feel the three cases mentioned repeat a similar victim age for a reason. Prevention is the better measure.

The final suggestion is a recognition that community policing doesn't always work. If enough players in a community are toxic and fail to hold one another accountable for their actions, it's another wash. Communities can build, provide content, and breathe life into a platform, but they tend to fail spectacularly when it comes time to support one another in the face of bullying or abuse. Self-interest throws a wrench in things pretty effectively.

Platform safety is a huge undertaking that doesn't change overnight. Still, it's better to have open conversations and try to think of ideas to better serve everyone.

For podcasts that explore this topic further in different forms, we recommend the following from Voices of VR:

#1057: What Parents Should Know about Social VR, Understanding Social VR Harassment, & Parental Guidance for the Metaverse with Lance G. Powell, Jr.
#690: Survey of Harassment in VR: Cultural Dynamics vs Tech Solutions
#789: Human Rights in the Metaverse: Brittan Heller on Curtailing Harassment & Hate Speech in Virtual Spaces