A new white paper published by the UC Berkeley Center for Long-Term Cybersecurity (CLTC) calls upon Meta and other technology-makers to establish stronger, clearer guidelines to reduce the potential harms of “social virtual reality,” or social VR. Unlike Facebook, Twitter, and other “two-dimensional” (2D) social media platforms, social VR allows users to appear as avatars that can move, gesture, and speak directly with others, opening the door to new forms of harassment, including verbal abuse and groping.
“Social VR platforms should proactively develop community guidelines that maintain the standards of accessibility, comprehensiveness, specificity, and transparency used by 2D social media platforms,” wrote the report’s author, Rafi Lazerson, who conducted the research as an Alternative Digital Futures Researcher with CLTC. “Platforms should develop the community guidelines for the unique experience of immersive, conduct-based interactions in social VR. If applying community guidelines from 2D social media to social VR, detailed playbooks are needed to clarify how the principles and policies apply to unique forms of content and conduct in social VR.”
Platforms like Meta should act with urgency, Lazerson argues, because social VR is still nascent but growing rapidly. “Harassment in social VR is likened to in-person harassment, is likely pervasive, and impacts individuals from marginalized communities disproportionately,” he wrote. “Without immediate action to review and design effective community safety practices, including community guidelines, social VR stands to exacerbate inequalities rather than expand opportunities for inclusive positive interaction.”
For his research, Lazerson, who earned a Master of Public Affairs (MPA) degree from the UC Berkeley Goldman School of Public Policy, conducted a review of academic literature, think-tank publications, and media reports related to social VR. He closely examined the existing guidelines of 2D social media platforms and conducted an in-depth analysis of Meta’s community guidelines related to social VR, including the Horizon Policy, Conduct in VR Policy, and Facebook Community Standards. He also attended several webinars related to social VR, as well as the 2022 Augmented World Expo (AWE). (Read a Q&A with Lazerson about this project.)
Based on this research, Lazerson details the need for community guidelines in social VR to match the baseline for thoroughness now established in community guidelines for 2D social media. The paper focuses primarily on the social VR community guidelines of Meta, the company formerly known as Facebook, which was renamed in 2021 to emphasize its strategic focus on the metaverse and which markets the popular Meta (formerly Oculus) Quest VR headsets.
Ultimately, Lazerson identifies four key areas for Meta and other social VR platforms to focus on: increased clarity around when and where different community standards apply; increased comprehensiveness, as the current policies do not address the breadth of potential harms in social VR; increased specificity, as Meta’s current policies are often ambiguous about what constitutes a violation; and increased transparency, as the guidelines lack policy rationales, a record of changes, and an explanation of underlying values.
The paper details specific recommendations for how Meta can ensure the development of a secure and equitable metaverse. “The goal of this paper,” Lazerson wrote, “is to support a digital future in which social VR platforms proactively include robust community guidelines that clearly delineate prohibited online harms, foster inclusive user norms, and provide the public and end-users with transparency into their moderation practices.”