Toxicity is a broad term, and every gaming company, social platform, and individual moderator defines it differently. What a children’s social platform considers toxic is very different from what an adult game would call toxic.
If we define toxicity as the act of disrupting social interactions to the point that another user considers abandoning the conversation, or even leaving the platform, then the crux of content moderation remains the same no matter the platform. Above all, the most important aspect of assessing toxicity and online harassment is the impact on users, and how to prevent it in the first place.
The pandemic has made people even more reliant on digital means of communicating and socialising, so moderation has only become more critical. Gamers are passionate about the games they play, to the point where gaming becomes an important part of their day-to-day lives. For many, it’s not just an online interaction, but a key part of their social life, or a way to relax and unplug at the end of a long day. If toxicity begins to creep in, their favourite hangout space is suddenly disrupted, so it’s essential these communities are protected.