YouTube is by far the largest host of user-generated video content worldwide. Alas, the platform also hosts inappropriate, toxic, and hateful content. One community that has often been linked to sharing and publishing hateful and misogynistic content is the so-called Involuntary Celibates (Incels), a loosely defined movement ostensibly focusing on men's issues. In this paper, we set out to analyze the Incel community on YouTube by focusing on its evolution over the last decade and by investigating whether YouTube's recommendation algorithm steers users towards Incel-related videos. We collect videos shared in Incel communities on Reddit and perform a data-driven characterization of the content posted on YouTube. Among other things, we find that the Incel community on YouTube is gaining traction and that, over the last decade, the number of Incel-related videos and comments has risen substantially. We also find that users have a 6.3% chance of being recommended an Incel-related video by YouTube's recommendation algorithm within five hops when starting from a non-Incel-related video. Overall, our findings paint an alarming picture of online radicalization: not only is Incel activity increasing over time, but platforms may also play an active role in steering users towards such extreme content.