Modulate: Reducing toxicity in online games is a positive for profits
How Modulate’s AI-assisted voice moderation tool ToxMod isn’t just efficient at tackling toxicity but also drives increased user retention and spending
Toxicity is a serious issue that players expect studios of online games to address. New regulations are also demanding that studios do more to protect their players or face hefty fines.
While there is a clear moral and a growing legal imperative to protect players from toxicity on an online platform, tackling it also makes financial sense for studios looking to increase an online game’s revenue.
Modulate CEO and co-founder Mike Pappas told GamesIndustry.biz that the company’s AI-assisted voice moderation tool ToxMod isn’t just the most efficient way to combat toxicity, but that “effective content moderation helps to foster a positive and safe gaming environment, which directly improves player experience - and player retention.”
Positive and safe environments lead to increased spending
Live service games depend on a user base that keeps spending money on the platform, so it’s more important than ever to ensure you’re not losing customers to churn, which unchecked toxicity can cause. The same is true in the real world: customers are unlikely to return to an establishment that feels unsafe and unwelcoming, and its reputation may further put off potential new customers.
“In the EU, the Digital Services Act can levy fines up to 6% of a company’s worldwide annual turnover for failing to implement user safety policies”
“If I have a bad experience, I probably churn unless the platform is proactive in demonstrating their commitment to fixing things,” Pappas says. “Not only are players who experience or witness toxicity more likely to churn, but even those who stick around may become disillusioned and stop submitting user reports, which further exacerbates the toxicity problem.”
But while a game studio might not see the necessity of addressing toxicity if its title is popular and compelling enough that players stick around in spite of it, a survey from Take This shows that 61% of players choose to spend less money in games due to experiencing hate speech or harassment. After all, why would you want to spend money in an environment that makes you feel bad?
“There used to be this dangerous myth that the toxic players and the ‘whales’ were one and the same, and that was a ‘justification’ for not combatting toxicity,” explains Pappas. He points to a 2023 study by Dr. Constance Steinkuehler, which showed that average monthly in-game spending by players was $12.09 in toxic titles but rose to $21.10 in comparable titles with safer and more inclusive communities.
The legal cost of toxicity
While content moderation at scale can seem difficult and costly, the cost of doing nothing is greater, especially with growing public attention on digital spaces, where safety is becoming an international concern.
“The image of an angry gamer screaming into their headset while playing an online FPS game often comes to mind when we talk about toxicity in gaming, but content moderation can and should go beyond that narrow image of toxicity,” says Pappas.
“Young players are particularly vulnerable to even more nefarious forms of harm like sexual grooming and grooming for violence - which are thankfully quite rare, but devastating enough for even a single case that lawmakers around the globe have been reinforcing regulations to require platforms to proactively mitigate these risks.”
A bipartisan bill in the US Congress aims to require platforms to implement ‘reasonable measures’ to protect children from bullying, harassment and grooming, while countries like Singapore and India have already passed strict internet laws that impose a strong duty of care on platforms.
Failure to comply can also result in financial penalties. “In the EU, the Digital Services Act can levy fines up to 6% of a company’s worldwide annual turnover for failing to implement and report user safety policies and procedures, and the UK’s Online Safety Act can go up to 10% – that’s a huge sum for any sized company,” says Pappas.
Indeed, such penalties have already been levied: in 2022, Epic Games paid $275 million in a settlement with the US Federal Trade Commission (FTC) over claims it had violated the Children’s Online Privacy Protection Act (COPPA) by mishandling children’s personal data, and in part due to a lack of safety protections for minors.
ToxMod: not a cost but a revenue driver
It can be easy to treat content moderation as simply a cost of doing business in live service games. But while there is an upfront cost to implementing ToxMod, Pappas argues it works out far cheaper than the risk of falling foul of regulation and financial penalties, and that the cost is more than covered by the boost to the bottom line.
Take a hypothetical studio with around one million monthly active users (MAUs). While the cost of ToxMod depends on how heavily players use voice chat, even at $10,000 per month it can have clear financial benefits. That’s because studios can expect around 40% of those one million MAUs to be exposed to toxicity each month, with roughly 10-12% of the player base (at least 100,000 players) churning monthly as a result.
“If each of those monthly users generates even $1 per month, then that’s $100,000 lost per month,” says Pappas. With ToxMod implemented, preventing that churn would mean recovering $100,000 per month, or ten times the cost of deploying the tool.
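As a rough illustration, the back-of-envelope arithmetic behind that claim looks something like the sketch below. The churn rate, per-player revenue and the $10,000 monthly moderation cost are the hypothetical figures quoted above, not actual ToxMod pricing:

```python
# Illustrative back-of-envelope sketch using the hypothetical figures above;
# real churn rates, per-player revenue and moderation pricing vary by title.

monthly_active_users = 1_000_000
exposed_to_toxicity = 0.40 * monthly_active_users  # ~40% of MAUs exposed each month
churned_players = 0.10 * monthly_active_users      # ~10-12%, i.e. at least 100,000 players
revenue_per_player = 1.0                           # assume $1 per player per month
moderation_cost = 10_000                           # assumed monthly moderation cost

revenue_lost_to_churn = churned_players * revenue_per_player  # $100,000 per month
roi_multiple = revenue_lost_to_churn / moderation_cost        # ~10x the moderation cost

print(f"Revenue lost to churn: ${revenue_lost_to_churn:,.0f}/month")
print(f"Recovered revenue vs moderation cost: {roi_multiple:.0f}x")
```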
In one title using ToxMod, Modulate tracked active player numbers over several weeks after deployment. After just three days, the number of active players had increased by 6.3%, and by day 21 the increase had grown to 27.9%.
"The costs of toxicity far outweigh the costs of content moderation”
Taking into account the studies showing that user spending increases in more positive spaces, those additional active players are also likely to spend more on in-app purchases, further improving the studio’s bottom line.
This is before considering how ToxMod’s voice-native technology also reduces the mental toll on moderation teams. “We built ToxMod to help sift through the proverbial haystack and identify the worst, most urgent harms,” Pappas explains. “This allows moderators to prioritize where they can have the most impact - and in turn, have a much greater impact-per-hour, alleviating some of the pressure to be racing from one horrible situation to the next.”
By minimizing time needed to listen to harmful audio, moderators using ToxMod are able to mitigate five to ten times more harms than moderators going through the arduous process of reviewing audio manually. While ToxMod uses machine learning, having been trained on tens of millions of hours of gaming-specific voice chat, Pappas also stresses that it is a tool designed to be paired with a studio’s moderation team. “Their moderators can review ToxMod’s recommendations, and always have the final say on any action that will be taken on their end users.”
In closing, he says, “Considering the fact that cleaning up toxicity generally results in favorable media coverage and consumer sentiment; plus the increased player trust generated by more consistently taking action against offenders, it becomes a no-brainer: the costs of toxicity far outweigh the costs of content moderation.”