Towards Detecting Contextual Real-Time Toxicity for In-Game Chat

Published: 07 Oct 2023, Last Modified: 01 Dec 2023 · EMNLP 2023 Findings
Submission Type: Regular Long Paper
Submission Track: NLP Applications
Submission Track 2: Computational Social Science and Cultural Analytics
Keywords: Real-Time Toxicity Detection, Game Chat Toxicity, Game Chat Moderation
TL;DR: ToxBuster, a model for real-time toxic chat detection, effectively uses chat history and metadata to achieve the high precision and recall required for real-world deployment, and can also be applied to post-game moderation.
Abstract: Real-time toxicity detection in online environments poses a significant challenge due to the increasing prevalence of social media and gaming platforms. We introduce ToxBuster, a simple and scalable model that reliably detects toxic content in real-time for a line of chat by incorporating chat history and metadata. ToxBuster consistently outperforms conventional toxicity models across popular multiplayer games, including Rainbow Six Siege, For Honor, and DOTA 2. We conduct an ablation study to assess the importance of each model component and explore ToxBuster's transferability across these datasets. Furthermore, we showcase ToxBuster's efficacy in post-game moderation, successfully flagging 82.1% of chat-reported players at a precision level of 90.0%. We also show how an additional 6% of unreported toxic players can be proactively moderated.
Submission Number: 3609