
Creating a Safer and Happier Gaming Community: How Rec Room Reduced Toxicity and Improved Player Experience


Introduction
Ensuring player safety and a positive gaming experience is a top priority for game developers. In a recent VB Spotlight, Yasmin Hussain, Head of Trust and Safety at Rec Room, and Mark Frumkin, Director of Account Management at Modulate, discussed how Rec Room tackled toxicity in online gaming and implemented effective voice chat moderation strategies.

Rec Room: A Social Gaming Platform
Rec Room, a social gaming platform launched in 2016, boasts over 100 million lifetime users. The platform allows players to interact in real time through text and voice chat across various devices, including PC, mobile, VR headsets, and consoles. Trust and safety play a critical role in maintaining the platform’s integrity and creating a positive gaming environment.

The Challenge of Toxicity
Real-time voice chat in gaming often gives rise to toxic behavior. Hussain and her team at Rec Room recognized the need to address the problem and change the behavior of players who violated community standards.

Taking Steps Towards Change
To combat toxicity, Rec Room’s trust and safety team implemented several strategies. First, they extended continuous voice moderation coverage to all public rooms, setting clear expectations for behavior. They also experimented with different forms of intervention, including warnings and mutes. Through these tests, they discovered that instantly detecting a violation and imposing a one-hour mute significantly reduced bad behavior. This real-time feedback not only changed players’ behavior in the moment but also kept them engaged in the game.
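As a rough illustration of that feedback loop, here is a minimal Python sketch of an immediate-mute intervention. The one-hour duration mirrors the description above, but the class names, fields, and functions are hypothetical, not Rec Room’s actual implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

MUTE_DURATION = timedelta(hours=1)  # the one-hour mute described above

@dataclass
class Player:
    player_id: str
    muted_until: datetime | None = None  # None means voice chat is allowed

def on_violation_detected(player: Player, now: datetime) -> str:
    """Apply an immediate, in-the-moment consequence for a detected violation."""
    player.muted_until = now + MUTE_DURATION
    # The player stays in the game but loses voice chat until the mute expires,
    # giving real-time feedback without ejecting them from the session.
    return f"Player {player.player_id} muted until {player.muted_until:%H:%M}."

def can_use_voice(player: Player, now: datetime) -> bool:
    """Check whether the player's mute has expired."""
    return player.muted_until is None or now >= player.muted_until
```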

Targeting Specific Violators
Analyzing the data, the team found that a small percentage of players were responsible for the majority of violations. To address this specific cohort, they introduced stacked interventions. Instead of repeating the same warning or mute, they combined multiple interventions to reinforce the consequences of toxic behavior. This approach yielded positive results, further reducing violations.
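A stacked-intervention policy can be sketched as an escalation ladder, where each repeat offense triggers a stronger response instead of repeating the last one. The specific rungs below are illustrative assumptions, not Rec Room’s published policy:

```python
# Hypothetical escalation ladder: each repeat offense within a tracking window
# triggers a stronger, stacked response rather than the same warning or mute.
ESCALATION_LADDER = [
    "warning",        # first violation
    "mute_1h",        # second violation
    "mute_24h",       # third violation
    "temporary_ban",  # persistent offenders
]

def next_intervention(prior_violations: int) -> str:
    """Pick the next rung; repeat offenders stay at the strongest rung."""
    index = min(prior_violations, len(ESCALATION_LADDER) - 1)
    return ESCALATION_LADDER[index]
```

For example, next_intervention(0) returns a warning for a first offense, while a fourth or later offense keeps returning the strongest rung.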

Metrics and Experimentation
Frumkin emphasized that iterating on player moderation strategies depends on tracking specific metrics: the prevalence of toxicity, the profile of rule-breakers, and the frequency of code of conduct violations. Designing an effective intervention means stating a clear hypothesis, the behavior change it should produce, and the desired outcome, then letting each experiment run long enough to gather sufficient data before refining the strategy.
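Those metrics can be computed from a simple violation log. The sketch below assumes each event records a player ID; the field names and the top-1% concentration measure are assumptions for illustration:

```python
from collections import Counter

def moderation_metrics(violations: list[dict], active_players: int) -> dict:
    """Summarize the metrics discussed above from a log of violation events.

    Each event is assumed to look like {"player_id": ..., "type": ...}.
    """
    offenders = Counter(e["player_id"] for e in violations)
    return {
        # Prevalence: share of active players who violated the code of conduct.
        "prevalence": len(offenders) / active_players if active_players else 0.0,
        # Frequency: average violations per offending player.
        "violations_per_offender": (
            len(violations) / len(offenders) if offenders else 0.0
        ),
        # Concentration: share of violations from the top 1% of offenders,
        # which reveals whether a small cohort drives most of the problem.
        "top_offender_share": (
            sum(n for _, n in offenders.most_common(max(1, len(offenders) // 100)))
            / len(violations)
            if violations else 0.0
        ),
    }
```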

Continuous Improvement
While Rec Room has achieved significant success in reducing violations and improving the player experience, the team acknowledges there is always room for improvement, and they continuously evolve their moderation strategies to address new challenges as they arise. Real-time speech moderation remains a hard problem, but AI-powered tools like ToxMod have had a tremendous impact on their moderation efforts.

The Future of AI-Powered Voice Moderation
ToxMod, an AI-powered voice chat moderation solution, continuously analyzes conversations to identify policy violations and toxic language. However, moderation should focus not only on discouraging negative behavior but also on encouraging pro-social behavior. Identifying positive role models within the community and amplifying their impact can create a safer and more enjoyable gaming environment.
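One way to act on both kinds of signal is to route each voice session on a negative and a positive score. The scores are assumed to come from an upstream classifier such as ToxMod; the thresholds and routing logic below are purely illustrative and not ToxMod’s actual API:

```python
def route_session(negative_score: float, positive_score: float,
                  harm_threshold: float = 0.8,
                  prosocial_threshold: float = 0.8) -> str:
    """Route a voice session using both negative and positive signals.

    Scores in [0, 1] are assumed to come from an upstream classifier;
    the thresholds and action names here are hypothetical.
    """
    if negative_score >= harm_threshold:
        return "escalate_to_moderation"    # likely code-of-conduct violation
    if positive_score >= prosocial_threshold:
        return "flag_positive_role_model"  # candidate for amplification or rewards
    return "no_action"
```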

Conclusion
Rec Room’s trust and safety team has achieved remarkable success in reducing toxicity and improving player experiences through effective voice chat moderation strategies. By leveraging AI-powered tools and constantly iterating on their approach, they have created a community where players feel safe, welcome, and engaged. The future of online gaming holds immense opportunities for leveraging machine learning and ensuring player safety.