
Facebook, now Meta, struggles to combat harassment in virtual reality - CNET

Sydney Smith had dealt with lewd, sexist remarks for more than a month while playing the Echo VR video game. But the 20-year-old reached her breaking point this summer.

Echo VR places players in the bodies of futuristic robots, allowing them to compete in a zero-gravity sports game that's similar to Ultimate Frisbee. Players, each identified by a username that floats above their avatar, split up into two teams and score points when they throw a disc through an opponent's goal. 

The Oculus Quest 2 VR headset. (Getty Images)

In July, Smith was playing the game on an Oculus Quest 2 headset when she missed catching the disc and uttered the F-word out of frustration. A player, who'd been hurling insults at her teammates earlier, quickly took notice. The player taunted Smith, telling the Missouri resident that he'd recorded her and was going to "jerk off" to her cursing.

Smith tried to figure out which player had harassed her, so she could file a report. But that was tough because multiple people were talking at the same time. Since she hadn't been recording the match, Smith couldn't rewatch the encounter and look for a username.

"That really really bothered me," Smith said of the incident. "I couldn't touch the game for two weeks after that." 

Smith isn't the only virtual reality player who's had trouble reporting an ugly run-in. Though Oculus and Echo VR, both owned by Facebook, have ways to report users who violate their rules, people who've experienced or witnessed harassment and offensive behavior in virtual environments say a cumbersome process deters them from filing a report. Content moderators have to examine a person's behavior, as well as words. (Oculus' VR policy says users aren't allowed to follow other users against their wishes, make sexual gestures or block someone's normal movement.)

As Facebook focuses on creating the metaverse -- a 3D digital world where people can play, work, learn and socialize -- content moderation will only get more complex. The company, which recently rebranded as Meta to highlight its ambitions, already struggles to combat hate speech and harassment on its popular social media platforms, where people leave behind a record of their remarks. The immersive spaces envisioned by Mark Zuckerberg, the company's boss, will be more challenging to police. 

This story is partly based on disclosures made by Frances Haugen, a former Facebook employee, to the US Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. A consortium of news organizations, including CNET, received redacted versions of the documents obtained by Congress. 

"The issue of harassment in VR is a huge one," Haugen said. "There's going to be whole new art forms of how to harass people that are about plausible deniability." The tech company would need to hire substantially more people, and likely recruit volunteers, to adequately deal with this problem, she said. 

Facebook has more than 40,000 people working on safety and security. The company doesn't break down how many are dedicated to its VR platform. 

A well-known problem

An internal post from Jan. 28 that was part of Haugen's disclosures shows that Facebook employees are aware VR reporting systems fall short. 

In the post, an unnamed Facebook employee reports not having a "good time" using the social VR app Rec Room on the Oculus Quest headset, because someone was chanting a racial slur. The employee tried reporting the "bigot" but mentions being unable to identify the username. The employee reported exiting the virtual world "feeling defeated."

Rec Room users can report players by using their virtual wristwatch. (Screenshot by Queenie Wong/CNET)

Rec Room is a good example of what the metaverse might become. The app allows people to dress up their avatars. Users can chat, create or play games such as paintball, laser tag and dodgeball with other Rec Room users. 

The Facebook employee, a first-time user of Rec Room, doesn't specify why it was difficult to identify the speaker. Rec Room's avatars display lines to indicate they're speaking. Users can also look each other up in the app's people tab and initiate a vote to kick someone out of a room. To mute another player, an avatar holds up a hand. 

In an email, Rec Room CEO and co-founder Nick Fajt said a player using the same racial slur was banned after reports from other players. Fajt believes the banned player is the same person the Facebook employee complained about. 

"We want Rec Room to be a fun and welcoming experience for everyone, and we spend a lot of time building systems and growing our moderation teams to meet that goal. Still, there's always more work to be done here, and we plan to continue investing heavily to improve," he said.

The internal post prompted a thread of 106 comments. One Facebook employee said Rec Room ranked high in a survey that Facebook conducted to understand the "prevalence of integrity issues/abusive interactions at the app level." Another employee said Echo VR also ranked high in the survey.

"We see similar issues in Echo VR where while the user is able to identify the aggressor, at times those evaluating the abuse are unable to pinpoint who is saying what," a third employee said. 

Meta declined to make the survey available to CNET.

Bill Stillwell, Oculus' product manager of VR privacy and integrity, said in a statement that the company wants people to "feel like they're in control of their VR experience and to feel safe on our platform."

Users can report problems, and developers have tools to moderate their apps, Stillwell said. "But the tools can always improve," he said. "Our job isn't just to identify the tech that works for today, it's to invent entirely new tools to meet current and future ecosystem needs." 

Meta is exploring a way to allow users to retroactively record on its VR platform. It's also looking at the best ways to use artificial intelligence to combat harassment in VR, said Kristina Milian, a Meta spokeswoman. The company, though, can't record everything people do in VR, because it would violate privacy, as well as use up the headsets' storage and power. Andrew Bosworth, who will become Meta's chief technology officer, told employees in a March internal memo that he wants virtual worlds to have "almost Disney levels of safety" but acknowledged that moderating users "at any meaningful scale is practically impossible," according to The Financial Times.

Online harassment is still a big problem. Four in 10 US adults have experienced online harassment, and those under 30 are more likely not only to encounter harassment but also to face more serious abuse, according to a study released this year by the Pew Research Center. Meta declined to say how many reports Oculus has received about harassment or hate speech.

A 2019 study about harassment in VR from Oculus researchers also found that the definition of online harassment is highly subjective and personal but that the sense of presence in VR makes harassment feel more "intense." 

Brittan Heller, a lawyer at Foley Hoag and the founding director of the Anti-Defamation League's Center on Technology and Society, says the nature of VR will make moderating user behavior difficult. 

"The challenge with harassment in VR is its presence," Heller said. "It feels real, like a person is stepping next to you and saying and doing things that violate your personal space."

Toxic players

Facebook didn't develop Rec Room, but the game became available on the Oculus Quest in 2019 and the Oculus Quest 2 in 2020. It's available for other platforms, such as Microsoft Windows and PlayStation, so Quest users might be interacting with Rec Room users on other services. Oculus offers a way to report users who violate its rules. But it can take action only against users on its platform. It can't, for example, disable an account from another platform, like Xbox or PlayStation.

In an Echo VR lobby in November, a user (left) tells another player using a voice changer to "Go shove your dick down your throat." (Screenshot by Queenie Wong/CNET)

To understand Echo VR's and Rec Room's harassment problems, I put on an Oculus Quest headset and visited both virtual worlds in November. It wasn't long before I encountered toxic behavior. 

In an Echo VR lobby, I overheard one player telling another user who appeared confused to "Go shove your dick down your throat," before the abuser vanished completely. On another day, I heard two players calling each other names, but I was too far away to read their usernames. As I flew closer, they flew away, making it tough to see who was speaking, even when a sound icon appeared above a robot's head. 

In Rec Room, a player started shooting others, including my female avatar, with confetti and shouting, "You're gay now!" Other users in the Rec Center reported him for violating the app's rules.

Part of the challenge with VR is that players are new to the environment. They have to learn how to hold an object and move. It isn't immediately obvious how to report abusive players or find safety tools. 

Both games encourage users to be nice to one another, displaying posters with their code of conduct. They also have moderators. Echo VR teaches new users how to mute other players and set up a personal bubble. It's also possible to change the pitch of your voice so you can disguise your gender. The game doesn't include a tutorial for reporting other players, though information is available online in a blog post. Rec Room has YouTube tutorials for reporting and muting players. 

In November, Rec Room started testing automatic voice moderation. In a blog post, Rec Room says users might start noticing that the "person yelling racial slurs quickly gets their mic muted, or the person making explicit sexual statements to everyone around them gets sent back to their dorm."

Policing gestures, behavior 

Jason Lemon, a 36-year-old in Texas, said verbal harassment and racism in virtual reality don't feel different from similar bullying in console gaming.

Oculus asks users to provide a username and video evidence when submitting a report of abuse. (Screenshot by Queenie Wong/CNET)

"What makes Echo VR different is the type of mannerisms and body language that you can also put off," said Lemon, who is Black. He thinks that personal bubbles should be automatically activated after goals are scored, a period of dead time when he's seen players hump others or make sexual gestures. 

Theo Young, 17, said he started noticing more toxic behavior, including homophobic language, in Echo VR's social lobbies last spring. Young, who's played enough Echo VR to reach the game's top level, said he stopped playing when he saw other players harassing a female player. The Iowa resident said he tried to tell the female player how to mute or ghost others but she couldn't hear him among all the screaming users crowding around her and making sexual comments.

"That's the part that got to me. Just seeing other people have such an awful time," Young said. "I dropped off the game pretty hard after that experience. It just wasn't fun anymore."

As for Smith, she thinks Echo VR should have a strike system, a way to identify players by pointing at them, or features to make the avatars look different so everyone doesn't look like the same robot. 

"Companies need to step up and come up with new ways to moderate and help us," she said, "because we're the ones getting harassed."
