AI in games has actually managed to be useful for once. According to Activision, Call of Duty has seen a 43% decrease in exposure to 'disruptive' voice chat since the beginning of the year, and it's all thanks to the robo-snitch.

AI in games is a weird subject--glassy-eyed, soulless neo-NPCs, hideous AI art slapped onto social media accounts and, thanks to the seemingly irresistible, gravitational allure of its power as a buzzword to sell software and tat, a lot of confusion as to what "AI-powered" actually means.

I'm a skeptic. While I don't think AI, and more specifically generative AI, has much to offer in terms of writing and art, I have to admit that Activision, of all studios, might've found a good use for it with Modulate's ToxMod.

I say might've because it's part of my job to be skeptical, and because Activision has a vested interest in making you believe it makes good decisions. Still, according to its blog post on the latest toxicity report, the numbers are, supposedly, looking pretty good.

Activision claims that since implementing an improved voice chat enforcement system in June 2024, Call of Duty has seen a combined 67% reduction in repeat offenders of voice chat-based offenses across Modern Warfare 3 and Call of Duty: Warzone. As of July 2024, 80% of the players issued a voice-chat enforcement since launch hadn't reoffended, and exposure to disruptive voice chat continues to drop, falling by 43% since January 2024.

ToxMod is the AI software that Activision announced would be coming to Call of Duty in August of last year. It acts as a goodie-two-shoes narc that's always listening in on your games. It isn't responsible for banning anyone; instead, it listens in and reports your flaming to a human enforcement team, which helps make up for AI's habit of imagining things.

ToxMod is also supposed to distinguish between banter and genuine hatred, which is a very big claim that I'd urge you to treat with skepticism, and it comes with a severity-rating system.

Modulate notes that certain slurs, such as the n-word, have been reclaimed by the communities they were once aimed at, so when ToxMod detects a slur, it keeps an eye on how the other people in the chat react: "If a person uses the n-word, and offends the rest of the chatters, this will be treated much more harshly than if it is reclaimed usage, which is naturally incorporated into a conversation."

Other common-sense factors play in, too. If there's someone young-sounding in the chat, for example, offenses will be rated more severely. We adults can probably handle some bad language, but I'm sure we can agree that teaching little Timmy to puff his chest out by insulting my sexuality isn't good for anyone.
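
For what it's worth, here's a rough sketch of what that kind of context-aware severity scoring might look like. To be clear, this is my own hypothetical Python, not Modulate's code: every name, field, and weight here is invented purely to illustrate the idea of scaling severity by how the chat reacts and who's listening.

```python
# Hypothetical illustration of context-aware severity scoring, loosely
# modelled on how Modulate describes ToxMod's behaviour. Not real API;
# all names and weights are invented for the sake of the example.

from dataclasses import dataclass


@dataclass
class ChatEvent:
    contains_slur: bool          # a flagged term was detected in the clip
    negative_reactions: int      # listeners who reacted with hostility or distress
    total_listeners: int         # everyone else in the voice channel
    minor_likely_present: bool   # a young-sounding voice was detected


def severity(event: ChatEvent) -> float:
    """Return a 0..1 severity score; higher means escalate to human review sooner."""
    if not event.contains_slur:
        return 0.0
    # Reclaimed or banter usage: if nobody in the chat reacts badly, score low.
    reaction_ratio = (
        event.negative_reactions / event.total_listeners
        if event.total_listeners else 1.0
    )
    score = 0.3 + 0.5 * reaction_ratio
    # Common-sense escalation: a young-sounding listener raises the stakes.
    if event.minor_likely_present:
        score += 0.2
    return min(score, 1.0)
```

The key design point, as Modulate tells it, is that the word alone isn't the signal; the same utterance can land anywhere on the scale depending on the room it's said in.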

I won't say the situation is great, but an AI listening in on voice chat is hardly beyond the pale these days. Your search history is used to tailor ads, your data from just about any social media platform can be sold to the highest bidder, and if you own a smartphone, your exact location dating back months is likely up for grabs. All of this is to say that an AI reporting system is probably not the most important thing we should be concerned about.

If Activision's figures are taken in good faith, then it's an excellent trade-off. Call of Duty has a reputation for being toxic when it comes to voice chat. Whether that's deserved is another matter, but games rarely pick up a rep like that for no reason. Relieving the pressure on mod teams and taking the responsibility of reporting out of players' hands, so they can concentrate on playing the game instead of navigating menus, seems like a net benefit. I reluctantly tip my hat.
