Toxicity and the hidden dangers of shoddy moderation

April 8, 2022

Janne Huuskonen, Director, Marketing and Communications at Utopia Analytics

We think of games as a natural safe space: a medium that is all about excitement, immersion and fun. Games are a place where we can be the hero and do impossible things. But for a significant number of gamers, their experience has become tainted by a toxic minority, as moderation fails to keep pace with the shift to online, communal gaming.

Last year, Utopia Analytics created a report, Playing games online in 2021: Toxicity, misogyny and missing moderation, to shine a light on the amount of toxicity the average gamer experiences. The report, which surveyed more than 1,000 players in the US, found that a staggering 70% had experienced harassment whilst playing online multiplayer games. What this tells us in no uncertain terms is that whatever moderation strategies these games may have in place simply aren’t up to the task.

Effective moderation must always be focused on the player, so they are shielded from inappropriate or outright harmful content. But poor moderation has a much wider impact than the player’s experience alone; it can also have a debilitating effect on a company’s bottom line through player churn and bad publicity.

The brand risk from poor moderation

Toxic content can have a tremendous impact on the image of a carefully curated brand. In fact, a Pew study revealed that 80% of people consider online services directly responsible for the bad content and harassment taking place on their platforms. When community leaders allow harmful content to be published, it’s seen as an endorsement of the behaviour – despite the platform holders’ efforts to remain content-neutral.

According to a 2017 study from Stanford and Cornell Universities, even a small number of bad comments can have a snowball effect that turns a civil online conversation into a toxic one. The research also showed that toxicity is contagious: simply being exposed to inappropriate content made users much more likely to reciprocate and act inappropriately themselves. Even a handful of toxic players can wreck a positive atmosphere and, in the long run, destroy the feeling of community.

Gaming companies and communities must manage all the content on their platforms, regardless of who produced it. A reputation for toxicity is easy to earn but difficult to shed, and rebuilding brand trust can take a lot of time and money.

Feel the churn

A study by Riot Games of its hugely popular League of Legends found that toxicity has a tangible effect on player retention: first-time players who encountered it were 320% more likely to churn immediately and never play again.

While Blizzard’s well-publicised issues around misogyny and poor leadership aren’t a moderation failure, they are still a toxicity problem, and it’s easy to see the effect this kind of negative press has. Player numbers across all of the company’s titles have nearly halved since 2017, from 46 million players to 26 million, with some commentators suggesting the trend has accelerated since the so-called “Cosby Suite” came to the public’s attention.

One of World of Warcraft’s popular streamers, Asmongold, made headlines last year after a very public spat with Blizzard. Since then, players have followed him in droves as he swapped WoW for Final Fantasy XIV. In stark contrast to World of Warcraft’s well-known toxicity problems, FF XIV has been praised for its open and supportive community.

Incidents like this are significant because gameplay streaming has become a vital, and highly profitable, piece of the marketing and brand-building puzzle. But it also gives disgruntled players a very public platform to shout about things they dislike, and those with enough clout can have a major impact on a game.

Sponsors and brands don’t want to associate with toxic communities

As player numbers continue to swell, gaming has become the perfect environment for advertisers to reach wider audiences than ever. Courting high-profile partnerships and collaborating with well-known brands has become a big part of the business of games. From in-game fashion items and movie launches to exclusive limited-time events – who wouldn’t want to wield Thanos’ Infinity Gauntlet in Fortnite?

But without the right moderation strategy in place, brands won’t want to collaborate with a specific game if it’s well-known for toxicity, as it could tarnish their reputation by association.

This is even more pressing in competitive gaming, where, much like in traditional sports, sponsorship is a key revenue stream – as much as 50% of esports revenue comes from it. Activision Blizzard’s current struggles also appear to have led T-Mobile to pull its sponsorship of the Call of Duty and Overwatch professional esports leagues, and depending on the result of the active lawsuit against the company, others could follow. So it’s vital that the major platforms in esports adopt the right moderation tools to protect their interests and appeal to the widest possible range of brands, without toxic players, fans or even bots putting off potential business partners.

That’s not to say that major titles haven’t been trying to stem the flow of hate speech in their communities. Industry leaders are grappling with how to better moderate and manage their communities. Discord recently acquired AI moderation company Sentropy, and Unity followed suit with its acquisition of voice chat moderator OTO, showing that calls for change are being heard. But it seems that many companies are still unsure of how to deal with the content management problem.

Moderation needs to keep pace with the ways we now play games

So how can game publishers make sure their communities are a safe place for players?

Right now, moderation – if it’s in place at all – is likely to be a combination of basic content filters, banned word lists and human moderation. Lots of games and platforms don’t even have this, and instead rely on the players themselves to report toxicity. It’s no wonder so many gamers report rising harassment and toxicity when the approach to moderation is so woefully inadequate.
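To see why that baseline struggles, here is a minimal sketch of a banned-word filter (the word list and messages are purely hypothetical). Exact matches are caught, but trivial obfuscation slips through, while harmless messages that happen to contain a listed word are blocked:

```python
import re

# Hypothetical banned-word list; real lists run to thousands of entries per language.
BANNED_WORDS = {"idiot", "trash"}

def is_allowed(message: str) -> bool:
    """Naive filter: reject a message only if it contains an exact banned word."""
    tokens = re.findall(r"[a-z]+", message.lower())
    return not any(token in BANNED_WORDS for token in tokens)

print(is_allowed("you are an idiot"))              # False – exact match is caught
print(is_allowed("you are an id1ot"))              # True  – simple obfuscation slips through
print(is_allowed("nice shot, not trash at all!"))  # False – context and intent are lost
```

Filters like this also need constant manual updating as slang shifts, which is exactly the maintenance burden that breaks down at scale.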

We are now in an era of games as a service, where Fortnite, Minecraft and Roblox can have hundreds of millions of players across multiple platforms, and console owners can stream their games to a mobile phone or smart TV. The way we moderate games and safeguard players must fit with these models, and the only way that can be done is to use automation, AI and other technologies that can cope with the scale and complexity of the games industry as it is today and will be tomorrow.
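As a purely illustrative sketch – not Utopia AI Moderator’s actual pipeline – the general idea behind machine-learned moderation looks something like the following: a model is trained on messages that humans have already labelled, then scores new messages automatically at whatever volume the community produces. The training examples below are hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled chat messages; a production system would learn from
# millions of human moderation decisions in each community's own language and slang.
messages = [
    "great game, well played everyone",
    "thanks for the carry, see you next match",
    "uninstall the game you worthless bot",
    "go back to the tutorial, trash player",
]
labels = [0, 0, 1, 1]  # 0 = acceptable, 1 = toxic

# Turn text into features and fit a simple classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Score an unseen message; a real deployment would act on this probability
# (hide the message, queue it for review, warn the sender) in real time.
new_message = ["you are a worthless player, uninstall"]
print(model.predict_proba(new_message)[0][1])  # probability the message is toxic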

Want to learn more?

Check out our case studies or contact us if you have questions or want a demo.
