Activision has released the initial findings from ToxMod, the AI voice-moderation tool it introduced for Call of Duty last year. Since the rollout began in August, more than 2 million accounts have faced in-game enforcement for disruptive voice chat.

ToxMod detects disruptive comments in 14 languages across the Call of Duty titles Modern Warfare II, Modern Warfare III, and Warzone, allowing the company to take action against players who violate the Code of Conduct. Activision noted that only 1 in 5 players report the disruptive behavior they encounter, so it continues to stress that active reporting is crucial.

According to the company, the AI model has led to a reduction in repeat offenders and a 50% decrease in severe instances of disruptive voice chat. Activision is also encouraging players to follow the updated Code of Conduct, which takes a zero-tolerance stance on toxic speech, as part of its stated commitment to combating toxicity and ensuring a fair and fun gaming experience for all players.