Sponsored by Modulate

Growing Online Communities Through Proactive Voice Moderation


Game Developer, Staff

December 15, 2023


Game developers spend endless hours crafting and refining experiences to ensure their players walk away with a strong and positive impression of their game. And rightfully so; at the end of the day, player experience can make or break even the most promising titles. However, for online games, much of that lasting impression rests on the interactions between players, interactions developers often feel fall beyond their control.

When we talk about content moderation, especially in voice chat-enabled online games, most common solutions focus on reactive approaches: player-driven reporting tools, or retroactive bans issued hours or days after an incident has occurred. These approaches can reduce the chance of future toxicity from repeat offenders, sure, but in many cases the damage has already been done, and that first offense has soured a new player on your game, or worse.

Proactive voice moderation is vital for safeguarding online experiences in games, and key to helping game studios both protect their players from toxicity and create healthy, thriving communities. 

This is a driving principle behind ToxMod, a real-time voice-native moderation solution from prosocial voice technology expert Modulate. By connecting developers with tools like ToxMod and further resources on moderation and Trust & Safety, Modulate strives to create safer and more inclusive online communities for gamers and developers alike.


Left Unchecked, Toxicity Runs Rampant in Online Communities

In the current online landscape, it's more likely than not that players will experience some kind of toxic or hostile behavior in-game at one point or another. A 2022 ADL study found that 77% of players have experienced severe toxicity online, an increase of more than 10% in just three years. Members of marginalized communities are unfortunately more likely to find themselves on the receiving end of this harmful behavior.

If not properly addressed, in-game toxicity can quickly fester and damage a game's reputation, along with the lives and wellbeing of the individuals within its player base. And if left unchecked, even unintentionally fostering a space for harmful interactions can lead to fines or lawsuits levied against studios and companies.

In order to truly foster safe, diverse, and enjoyable communities within games, content moderation needs to be front of mind for game makers.

The game industry is very familiar with this looming toxicity problem. However, most existing systems are only equipped to deal with incidents after the damage is done, and given that fewer than 10% of incidents are typically reported by players, it's clear that reactive moderation isn’t enough to protect growing communities on its own.

Empowering Game Developers to Protect Players

The tools developers have at their disposal need to be able to work in tandem with studios’ own community management teams and unique codes of conduct, while also accounting for the nuanced online interactions that can differ from title to title. 

Likewise, the factors that define toxic behavior differ from game to game, community to community, and language to language. Thoughtful consideration and understanding of these nuances are vital for any tool to support the growth of healthy and diverse communities. 

Through research, experience, and strategic partnerships, Modulate has positioned its team and technology to provide flexible solutions that empower game developers to enforce their codes of conduct at scale, protect growing communities of players from toxicity, and improve online safety throughout the game industry.

This approach enables ToxMod to act as the cornerstone of a studio's holistic community management strategy. By leveraging machine learning, ToxMod uses proactive triaging models to analyze conversations in real time and identify potentially harmful situations within in-game voice chat.

These potential offenses are then escalated to moderators, but not before ToxMod conducts additional analysis that documents the severity of the situation, accounting for factors like context, slang, cultural norms, and the history between participating players. ToxMod extends this same nuanced understanding across 18 supported languages, factoring regional colloquialisms, dialects, and social norms into its analysis to ensure effective, multilingual moderation.
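To make that two-stage flow concrete, here is a minimal, purely illustrative sketch in Python. Every name and threshold in it is hypothetical, and it does not reflect ToxMod's actual API or internals; it only shows the general pattern described above, where a cheap real-time pass flags suspect audio and a deeper, context-aware severity analysis runs before anything reaches a human moderator.

```python
from dataclasses import dataclass

# Hypothetical sketch of a two-stage voice-triage pipeline.
# None of these names come from ToxMod; they only illustrate the
# pattern of fast flagging followed by context-aware severity analysis.

@dataclass
class VoiceClip:
    speaker_id: str
    transcript: str          # output of an upstream speech-to-text step
    toxicity_score: float    # cheap first-pass model score, 0.0-1.0

@dataclass
class TriageResult:
    clip: VoiceClip
    severity: float
    escalate: bool

FLAG_THRESHOLD = 0.6       # stage 1: filters the firehose of live audio
ESCALATE_THRESHOLD = 0.75  # stage 2: only high-severity cases reach moderators

def severity_analysis(clip: VoiceClip, offense_history: dict[str, int]) -> float:
    """Stage 2: refine the raw score with context before escalating.

    A production system would weigh slang, cultural norms, and the
    relationship between the players involved; a simple repeat-offense
    count stands in for all of that context here.
    """
    prior_offenses = offense_history.get(clip.speaker_id, 0)
    # Repeat offenders get a modest severity bump, capped at 1.0.
    return min(1.0, clip.toxicity_score + 0.05 * prior_offenses)

def triage(clip: VoiceClip, offense_history: dict[str, int]) -> TriageResult | None:
    # Stage 1: drop clips the fast model considers clearly benign.
    if clip.toxicity_score < FLAG_THRESHOLD:
        return None
    severity = severity_analysis(clip, offense_history)
    return TriageResult(clip, severity, escalate=severity >= ESCALATE_THRESHOLD)

if __name__ == "__main__":
    offense_history = {"player_42": 3}  # hypothetical prior-offense counts
    clip = VoiceClip("player_42", "<flagged utterance>", toxicity_score=0.68)
    result = triage(clip, offense_history)
    if result and result.escalate:
        print(f"Escalate to moderators, severity={result.severity:.2f}")
```

The key design idea the sketch tries to capture is separation of concerns: a lightweight always-on filter keeps compute costs manageable across every live conversation, while the more expensive contextual analysis runs only on the small fraction of clips that get flagged.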

ToxMod: Proactive, Scalable Voice Moderation Technology

To date, ToxMod's proactive voice moderation has been adopted by games like Breachers, Rec Room, and other titles across numerous genres and audiences. In a testament to its scalability, Activision integrated ToxMod this past August into Call of Duty: Modern Warfare II and Call of Duty: Warzone, as well as the recently released Call of Duty: Modern Warfare III, with Activision CTO Michael Vance hailing the partnership as "a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players."

From indie teams to triple-A studios, ToxMod aims to support developers no matter where they fall in the industry. Companies that integrate ToxMod into their games and moderation processes typically see exposure to toxicity drop by 25-33% and new player retention rise by 7-14% as a result.

Modulate's commitment to supporting the wider landscape of online safety extends beyond the strides made by its technology. Find more resources on protecting online communities on Modulate's website, or sign up for Modulate's Trust & Safety Lately newsletter for a digest covering content moderation, regulation and compliance, and trust & safety across the game industry.
