Along with the release of Call of Duty: Modern Warfare 3, the franchise is rolling out its AI-powered voice chat moderation technology to curb toxic speech in-game. The system has been in testing in North America since August, but the full global rollout has now commenced, coinciding with the launch of Modern Warfare 3.
AI Voice Chat Moderation is now available in Call of Duty
Toxic speech has long been a problem in the famous shooter franchise. Developer Infinity Ward vowed to remove abusive Call of Duty players from voice and text chat, but toxicity has remained an issue. With tens of millions of players checking in every day, it is a difficult problem to address, so Call of Duty officials have turned to new technology to curb abusive language.
The voice chat moderation system has gone live alongside the franchise's huge fall release, Call of Duty: Modern Warfare 3.
The AI-powered technology, known as “ToxMod,” has been in beta in Warzone and Modern Warfare 3 since late summer, but it is now being rolled out internationally across all three titles, according to the official Call of Duty blog. Modulate, a firm dedicated to combating harmful online behaviour, created the technology.
Call of Duty’s AI voice chat moderation is live worldwide (except Asia-Pacific) in MWIII, MWII, and Warzone.
AI moderation will report voice chat instances that break Call of Duty’s code of conduct, but AI cannot ban anyone. All reports are manually reviewed.
— CharlieIntel (@charlieINTEL) November 10, 2023
The moderation system formally launched worldwide, except in the Asia-Pacific region, with the release of Call of Duty: Modern Warfare 3. ToxMod filters both voice chat and text, and the system can understand 14 distinct languages.
The moderation team intends to expand in-game voice chat support to Spanish and Portuguese in the near future.
Call of Duty is not the only series suffering from online toxicity. Overwatch 2 players, for example, have requested stronger chat filters to curb abuse. Nor is the problem limited to one or two franchises.
This type of language is common in online chat spaces both inside and outside the gaming industry. With these latest moves, however, it appears that the Call of Duty team is taking some responsibility and working to make its ecosystem more welcoming to all players.
Call of Duty recently implemented a new code of conduct, which resulted in 500,000 suspensions. However, combating toxic behaviour will almost certainly require a continuing effort. Not to mention that the buzz surrounding AI will likely make this new move fairly contentious.
Even if done with good intentions, the thought of AI-powered systems listening in on conversations may unsettle some players. Still, it has long been felt that something should be done about the toxicity of online gaming, and judging by the actions of the Call of Duty team, it appears that they agree.