YouTube is by far the largest host of user-generated video content worldwide. Alas, the platform has also come under fire for hosting inappropriate, toxic, and hateful content. One community that has often been linked to sharing and publishing hateful and misogynistic content is the Involuntary Celibates (Incels), a loosely defined movement ostensibly focusing on men's issues. In this paper, we set out to analyze the Incel community on YouTube by focusing on its evolution over the last decade and on understanding whether YouTube's recommendation algorithm steers users towards Incel-related videos. We collect videos shared in Incel communities within Reddit and perform a data-driven characterization of the content posted on YouTube. Among other things, we find that the Incel community on YouTube is gaining traction and that, over the last decade, the number of Incel-related videos and comments has risen substantially. We also find that users have a 6.3% chance of being recommended an Incel-related video by YouTube's recommendation algorithm within five hops when starting from a non-Incel-related video. Overall, our findings paint an alarming picture of online radicalization: not only is Incel activity increasing over time, but platforms may also play an active role in steering users towards such extreme content.