Toxic content is a real risk to media companies

June 13, 2018

At the recent international MINDS and Newscamp media conferences, I gave presentations challenging companies to use AI tools for automatic moderation in order to improve content quality and customer engagement. The presentations attracted particular interest, highlighting the fact that media companies are vulnerable to toxic user behavior and keen to curb it. To understand why, here is a short look at the impacts of online toxicity.


Many media companies are deeply worried about the negative effect that toxic content can have on their carefully built brands. A recent Pew study revealed that 80% of people consider online services responsible for the bad content and harassment taking place on their platforms. This means that media companies need to manage all the content on their services regardless of who produced it, as any negative content can potentially damage the brand. Once a brand is damaged, it takes a lot of time and money to rebuild a positive reputation.

80% of people consider online services responsible for the bad content and harassment

Online user communities are very vulnerable to toxicity and trolling. According to a 2017 study from Stanford and Cornell Universities, only a few bad comments can be enough to derail an online conversation, and anyone can become a troll given the right circumstances. Even a small number of trolls can be devastating to the positive atmosphere of a service and, in the long run, destroy the feeling of community. While lively user communities that foster customer loyalty are painstakingly slow to build, they can easily be lost in the absence of moderation.

Time spent on the service

Some years ago, Riot Games' Jeffrey Lin commented that, based on their studies, users experiencing toxicity in their games are over three times more likely to leave. Similarly, in media, bad content has been shown to lead to degradation of the service and customer churn. While participation and high-quality discussions increase engagement and time spent on the service, bad content has the opposite effect.

Users experiencing toxicity in their games are three times more likely to leave

I had lengthy discussions with several media representatives at the conferences. It seemed that most companies are still pondering how to cope with the content management problem. Some have chosen to limit the opportunities for participation, while others rely on often laborious combinations of manual and automatic moderation. Some companies have simply given up and are hoping for the best. The best moderation and content management practices are still evolving. However, I am certain that automatic, AI-based real-time moderation tools such as our Utopia AI Moderator will play a major role in content management in the future.
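To make the idea of real-time automatic moderation concrete, here is a minimal, purely illustrative sketch of the decision step such a pipeline performs: score an incoming comment and then publish it, flag it for human review, or block it. The blocklist scoring, the example terms, and the thresholds below are all hypothetical stand-ins; a production tool like Utopia AI Moderator uses machine-learned models, not a word list.

```python
# Toy real-time moderation step -- an illustrative sketch only, not how
# Utopia AI Moderator (or any production system) actually scores content.

BLOCKLIST = {"idiot", "stupid", "trash"}  # hypothetical example terms


def toxicity_score(comment: str) -> float:
    """Fraction of words matching the blocklist (crude stand-in for a model)."""
    words = comment.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in BLOCKLIST)
    return hits / len(words)


def moderate(comment: str, block_at: float = 0.2, flag_at: float = 0.05) -> str:
    """Return 'publish', 'flag' (route to human review), or 'block'."""
    score = toxicity_score(comment)
    if score >= block_at:
        return "block"
    if score >= flag_at:
        return "flag"
    return "publish"
```

The key design point is the middle "flag" outcome: combining automatic scoring with human review for borderline cases mirrors the manual-plus-automatic workflows described above, while clear-cut content is handled in real time without human effort.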

Mari-Sanna Paukkeri

