
How AI Voice Moderation Reduced Toxicity in Call of Duty by 50%

Call of Duty has long been known for its toxic player base, but Activision and Modulate have teamed up to address the problem with AI voice moderation. The partnership cut voice-chat toxicity exposure by 50% in Call of Duty: Modern Warfare II and Call of Duty: Warzone in North America, while the newest release, Call of Duty: Modern Warfare III, saw fewer repeat offenders and a 25% reduction in toxicity exposure. These improvements have led to a better overall experience for players and improved player retention.

Toxic behavior in online gaming is a widespread problem, affecting not only individual gameplay experiences but also the sense of camaraderie and respect within gaming communities. Prior to the integration of ToxMod, an AI screening technology developed by Modulate, a significant number of players had experienced severe harassment while playing video games. The introduction of ToxMod has allowed for more proactive moderation, reducing toxicity exposure rates and improving player engagement.

ToxMod works by analyzing voice communications in real-time, using machine learning to differentiate between competitive banter and genuine harassment. It focuses on violations of Activision’s Code of Conduct, such as racial or sexual harassment, rather than minor profanity. By combining AI detection with human moderators, ToxMod ensures a more comprehensive and efficient moderation process.
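ToxMod's implementation is proprietary, but the triage flow described above can be sketched in a few lines. Everything here is a hypothetical illustration: the class name, the harassment score, and the thresholds are assumptions, not Modulate's actual API. The key idea is that a classifier score routes each clip to one of three outcomes, with ambiguous cases escalated to human moderators rather than auto-punished.

```python
from dataclasses import dataclass

# Hypothetical sketch of an AI-plus-human moderation triage step.
# The names, scores, and thresholds are illustrative assumptions only.

@dataclass
class VoiceClip:
    speaker_id: str
    transcript: str          # assumed output of a speech-to-text stage
    harassment_score: float  # assumed ML classifier output in [0, 1]

def triage(clip: VoiceClip,
           auto_flag_threshold: float = 0.9,
           review_threshold: float = 0.6) -> str:
    """Route a scored clip: ignore banter, queue borderline cases
    for human moderators, flag high-confidence violations."""
    if clip.harassment_score >= auto_flag_threshold:
        return "flag_for_enforcement"   # clear Code of Conduct violation
    if clip.harassment_score >= review_threshold:
        return "human_review"           # ambiguous: a moderator decides
    return "no_action"                  # competitive banter, minor profanity

clips = [
    VoiceClip("p1", "nice shot", 0.05),
    VoiceClip("p2", "targeted harassment", 0.95),
    VoiceClip("p3", "borderline taunt", 0.70),
]
decisions = [triage(c) for c in clips]
```

This two-threshold design mirrors the article's point that AI detection alone isn't the whole system: only high-confidence cases are flagged directly, and the gray zone goes to humans, keeping false positives from punishing ordinary trash talk.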

The initial analysis of ToxMod’s impact showed that a large percentage of toxicity exposure came from first-time offenders who were already active in Call of Duty. Furthermore, player-generated reports addressed only a small fraction of the violations, highlighting the limitations of reactive moderation methods. With ToxMod, Activision was able to identify and take action against offenders who may have gone unnoticed otherwise.

The integration of ToxMod into Call of Duty has had a significant positive impact on player engagement. Proactive moderation led to an increase in the overall number of active players, with the treatment group seeing more new players and more returning players who had previously been inactive. The longer the moderation efforts continued, the greater the positive impact on engagement.

The global launch of Call of Duty: Modern Warfare III further demonstrated the effectiveness of proactive moderation. There was a significant reduction in players exposed to severe instances of disruptive voice chat, a decrease in repeat offenders, and an increase in moderator enforcement of the Code of Conduct. ToxMod enabled Activision to catch and address harmful content without burdening players with the responsibility of reporting.

The integration of ToxMod into Call of Duty sets a new benchmark for proactive moderation in the gaming industry. By prioritizing real-time intervention and fostering a culture of respect and inclusivity, Activision is not only enhancing the gaming experience for its players but also leading by example for other game franchises. The success of this partnership has garnered interest from other game studios, researchers, and industry regulators who recognize the importance of addressing toxicity in gaming.

Overall, the integration of ToxMod has proven to be a win-win situation for both players and studios. Players have expressed gratitude for the improvements in the gaming experience, leading to increased player engagement and retention. The case study results have provided concrete evidence that trust and safety benefit both players and studios, paving the way for a more positive and inclusive gaming environment.
