
Anti-Toxicity Progress Report – Voice Chat Moderation

Call of Duty’s anti-toxicity team details efforts to reduce disruptive in-game behavior

Call of Duty is taking the next leap forward in its commitment to combat toxic and disruptive behavior: in-game voice chat moderation, beginning with the launch of Call of Duty®: Modern Warfare® III on November 10. Activision is teaming with Modulate to deliver global, real-time voice chat moderation at scale, starting with this fall’s upcoming Call of Duty blockbuster.

Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-powered voice chat moderation technology from Modulate, to identify toxic speech—including hate speech, discriminatory language, harassment, and more—in real time and enforce against it. This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which include text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.

An initial beta rollout of the voice chat moderation technology will begin in North America on August 30 inside the existing games Call of Duty: Modern Warfare II and Call of Duty: Warzone™, followed by a full worldwide release (excluding Asia) timed to the launch of Call of Duty: Modern Warfare III on November 10. Support will begin in English, with additional languages to follow at a later date.

Read the Call of Duty Voice Chat Moderation Q&A

Since the launch of Modern Warfare II, Call of Duty’s existing anti-toxicity moderation has restricted voice and/or text chat to over 1 million accounts detected to have violated the Call of Duty Code of Conduct. Consistently updated text and username filtering technology has established better real-time rejection of harmful language.

An examination of data from previously announced enforcement shows that 20% of players did not reoffend after receiving a first warning. Those who did reoffend were met with account penalties, which include but are not limited to feature restrictions (such as voice and text chat bans) and temporary account restrictions. This positive impact aligns with our strategy of working with players by providing clear feedback on their behavior.

As part of the collaboration with our partner studios, the anti-toxicity team also helped add Malicious Reporting to the Call of Duty Security and Enforcement Policy to combat a rise in false reporting in-game.

Read more about the Malicious Reporting policy update

This commitment to the game and the community from our players is incredibly important, and we are grateful for their efforts in combating disruptive behavior. We ask Call of Duty players to continue to report any disruptive behavior they encounter as we work to limit its impact in Call of Duty.

Teams across Call of Duty are dedicated to combating toxicity within our games. Utilizing new technology, developing critical partnerships, and evolving our methodologies is key in this ongoing commitment. As always, we look forward to working with our community to continue to make Call of Duty fair and fun for all.

© 2023 Activision Publishing, Inc. ACTIVISION, CALL OF DUTY, CALL OF DUTY WARZONE, and MODERN WARFARE are trademarks of Activision Publishing, Inc. All other trademarks and trade names are the property of their respective owner. 

For more information on Activision games, follow @Activision on Twitter, Facebook, and Instagram.
