If you regularly engage in the battlefield of Call Of Duty’s multiplayer, it is time to be more careful and become a more positive player. Activision is introducing a new voice chat moderation feature for its current-gen and upcoming Call Of Duty games. This AI voice moderation tool will listen in on players communicating inside Call Of Duty multiplayer matches, flagging toxic players so they can be banned for their behavior.
Call Of Duty Modern Warfare 3 Will Feature ‘ToxMod’ AI Moderation
To reduce toxicity in the game, Call Of Duty’s dedicated anti-toxicity team is introducing ToxMod. What is it, and how does it work? As Activision states, it is an AI-powered voice chat moderation technology, developed by Modulate and built on advanced machine learning. ToxMod will support English first, with additional languages confirmed for support at a later stage. The full release of this new AI-based moderation software is coming to Call Of Duty: Modern Warfare 3 later this year.
Designed to improve player safety, ToxMod actively analyzes every conversation to determine what counts as “toxic.” It will allow the moderators of Call Of Duty multiplayer to act swiftly on reports, including the “red flags” highlighted by Modulate’s ToxMod AI moderation technology. Activision claims ToxMod can detect all sorts of toxic behavior, including but not limited to:
- Hateful speech
- Discriminatory language
Is ToxMod Coming to CoD Warzone & Modern Warfare 2?
Yes, Activision is rolling out an initial beta of ToxMod to its existing games, including Call Of Duty: MW2 and Call Of Duty: Warzone. This beta began on August 30, 2023.
So, if you play Call Of Duty multiplayer actively, you can expect improved in-match moderation with ToxMod. Player reports can now be addressed faster than before because ToxMod provides moderators with all the relevant context surrounding each report.
Since the launch of CoD: MW2, Activision claims to have brought down the ban hammer on over 1 million toxic players. Thankfully, after the initial restriction, 20% of those players exhibited better behavior, while repeat offenders were met with stricter penalties. The developer states that its efforts to reduce in-game toxicity are working well.
With the implementation of ToxMod, we hope to see better days in the online multiplayer Call Of Duty community. Xbox, too, recently introduced a new Strike system, which Microsoft hopes will reduce toxicity across the entire Xbox network.
It is encouraging to see game developers continually rolling out new tools to curb online toxicity. With new AI tech such as ToxMod, we could see a dramatic reduction in toxic behavior. What are your thoughts on the new ToxMod update coming to Call of Duty games? Let us know in the comments below.