According to recent research, Call of Duty players are the most likely members of any gaming community to use derogatory language when engaging with others. Wordtips data shows the number of offensive terms per 1,000 words used by 100 independent Twitter users devoted to specific gaming fanbases, and Call of Duty tops the list. With 184 of every 1,000 words carrying a negative connotation, roughly 18% of what these Twitter users posted was labeled as negative or offensive.
No one is surprised by these results; Call of Duty players have been under fire (metaphorically speaking) for years due to their notoriety for toxicity. Bullying and intimidation are prevalent in live chat, drawing harsh condemnation from others online. It doesn't help that Activision's reputation is already shaky; both the company and its fans have been severely criticized for a permissive attitude toward harassment in the gaming and corporate spheres. Thanks to this ongoing pattern, Call of Duty has become a social media laughingstock, with jokes and memes circulating in every corner of the internet.
come over we’re playin call of duty and bullying kids on twitch— Mike Scollins (@mikescollins) March 1, 2023
However, Call of Duty isn't the only gaming community in trouble. Other fandoms flagged in the research for negative language included Mortal Kombat and Sonic the Hedgehog. Bullying and abuse have become a visible problem across gaming, with many companies clamping down on negative language in chat rooms in an effort to protect their players. Words deemed obscene or derogatory can be filtered out of text chat, while advances in AI have enabled live monitoring and reporting of abuse or harassment in voice chats.
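For a rough sense of how the text-chat side of this works, here is a minimal sketch of a word filter. The blocklist, the function name, and the asterisk-masking style are all illustrative assumptions for this example, not any platform's actual implementation; real systems are far more sophisticated, handling misspellings, leetspeak, and context.

```python
import re

# Hypothetical blocklist for illustration only; real filters use
# much larger, curated lists plus machine-learning classifiers.
BLOCKLIST = {"noob", "trash", "loser"}

def filter_message(message: str) -> str:
    """Replace blocklisted words (case-insensitive, whole words) with asterisks."""
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, BLOCKLIST)) + r")\b",
        re.IGNORECASE,
    )
    # Mask each flagged word with asterisks of the same length.
    return pattern.sub(lambda m: "*" * len(m.group()), message)

print(filter_message("gg you trash noob"))  # gg you ***** ****
```

A simple substitution like this is exactly the "band-aid" the article describes: it hides the words without addressing why they were typed.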
While these anti-harassment policies appear to be genuine efforts to assist the community, they are only a band-aid for a bigger problem: why are these people going out of their way to harass and berate others in the first place? Reporting, blocking, and banning are effective tools for protecting those affected, but understanding what motivates people to use aggressive language online is a problem that cannot be solved by filtering out bad words.