Frog Social recognized the importance of providing a secure and meaningful environment for its young user base from the outset of the app's development. They formed a content moderation team and implemented several tools to filter all user-generated content (UGC), namely usernames and chat messages exchanged between users.
“It's such a great feeling to know that we are now providing our users, especially the young ones, a safe environment on our platform. Utopia's team has been very professional and amazing to work with - they spent substantial time understanding what we needed and ultimately delivered on all their promises, and then some. We were impressed by how quickly we saw results from the AI model, even with tricky moderation cases, in a matter of weeks. Plus, they've been super attentive to our needs.”
Chief Operations Officer at Frog Social
1. Earlier approaches and challenges
Initially, Frog's team relied on blacklists and keyword filters to keep conversations respectful. Despite their best efforts, users found ways around these rule-based moderation tools: replacing letters with numbers, deliberately misspelling words, using symbols and slang, or composing hurtful sentences out of perfectly acceptable words. While most users adhered to the community guidelines, a few "bad apples" spread antisocial behavior, and bullying, disrespect, and other inappropriate conduct had started to affect well-behaved users negatively. Since offering the best possible user experience had been a priority since the app's creation, the Frog team wanted to ensure that the inappropriate messages slipping through the cracks of their moderation systems would be stopped once and for all.
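The weakness of rule-based filtering described above can be illustrated with a minimal sketch. The blacklist and messages here are hypothetical, and this is not Frog Social's actual filter; it simply shows why an exact-match word list misses the evasion tactics users employed.

```python
import re

# Hypothetical blacklist for illustration; not Frog Social's actual word list.
BLACKLIST = {"idiot", "loser"}

def naive_filter(message: str) -> bool:
    """Block a message only when a blacklisted word appears verbatim."""
    words = re.findall(r"[a-z]+", message.lower())
    return any(word in BLACKLIST for word in words)

# An exact match is caught:
assert naive_filter("you are an idiot")
# ...but the evasion tactics described above slip through:
assert not naive_filter("you are an 1d10t")      # letters swapped for numbers
assert not naive_filter("you are an i-d-i-o-t")  # symbols break the match
assert not naive_filter("what a nice friend")    # hurtful intent, acceptable words
```

The last case is the hardest: no word list, however long, can flag a hurtful sentence built entirely from acceptable words, which is why context-aware moderation is needed.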
At the same time, Frog Social was experiencing significant growth in its user base, resulting in very large chat volumes. This began to pose a serious challenge for the content moderation team, who struggled to keep pace with the frequency and the evolving methods of bad actors violating the app's guidelines and rules of conduct. The team at Frog realised they needed a different approach, and to act before the problem escalated. As a result, they decided to seek a partner with relevant expertise to help address these issues.
2. Objectives and solution
Frog Social was committed to creating a safe and amicable environment for children, teenagers, and their parents alike. Moderating chat messages, however, proved especially challenging: once a message is sent, the recipient expects to receive it within seconds, leaving no time for a human moderation team to intercept harmful content. It was evident that they needed a solution that could moderate chat messages with the proficiency of a human, including the ability to understand context, but in real time. The team at Frog turned to Utopia to test the Utopia AI Moderator solution.
Frog's database was initially unlabeled, with no record of which chat messages were appropriate and which were not. With the assistance of Utopia's team, however, it took only a couple of days to label a few thousand chat messages, which was sufficient as initial training data. Two weeks later, a tailor-made, production-quality AI model was delivered to Frog Social to put to the test.
Frog Social was very pleased with the performance of Utopia's tailor-made AI moderation model: it identified inappropriate content that the tools they had tried previously could not filter out. Utopia AI Moderator proved effective in detecting and halting toxic content that had previously passed through undetected, including content containing special characters, numbers, inappropriate emoticons, and Personally Identifiable Information (PII).
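One illustrative technique for catching character-substitution tricks (this is a sketch, not Utopia's actual model) is a normalization pass that maps common digit and symbol substitutions back to letters before matching. The word list and mapping below are hypothetical.

```python
# Illustrative normalization pass: undoing common character substitutions
# catches some obfuscation that a plain blacklist misses, though it still
# cannot judge context the way a trained AI model can.
LEET_MAP = str.maketrans({"1": "i", "0": "o", "3": "e", "4": "a",
                          "5": "s", "@": "a", "$": "s"})
BLACKLIST = {"idiot", "loser"}  # hypothetical word list

def normalize(message: str) -> str:
    """Undo digit/symbol substitutions and replace non-letters with spaces."""
    text = message.lower().translate(LEET_MAP)
    return "".join(ch if ch.isalpha() else " " for ch in text)

def filter_with_normalization(message: str) -> bool:
    return any(word in BLACKLIST for word in normalize(message).split())

assert filter_with_normalization("you are an 1d10t")  # caught after normalization
assert not filter_with_normalization("what a nice friend you are")  # context still needs AI
```

Even with such preprocessing, sentences that are hurtful only in context evade any word-level rule, which is the gap a context-aware model closes.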
The implementation of Utopia AI has since remarkably reduced the workload of Frog Social's content moderation team: operating around the clock with a 99.6% accuracy rate, it has freed the Frog team to concentrate on other pressing tasks they previously lacked time for. A very small number of messages are still forwarded to Frog's team for additional review, and these are used to update and further improve the AI model's performance.
About Frog Social
Frog Social is a social network that helps children and teenagers make friends. The app was founded by two students from the London School of Economics (LSE).
To date, they have welcomed several team members to the project, all working together to build and cultivate a social media app for the new generation. Their aim is to make positive changes to the social lives of their peers.
The idea for Frog Social was first conceived when the founders took part in a college competition. They saw an opportunity to create a platform that would let their generation connect with friends in a fresh and authentic way, without filters. Today, the app is gaining traction in Texas, California, and London.