Social media moderation policies are often at the center of public debate, and their implementation and enforcement are sometimes shrouded in mystery. Unsurprisingly, given limited platform transparency and data access, relatively little research has been devoted to characterizing moderation dynamics, especially in the context of controversial events and the platform activity associated with them. Here, we study the dynamics of account creation and suspension on Twitter during two global political events: Russia's invasion of Ukraine and the 2022 French Presidential election. Leveraging a large-scale dataset of 270M tweets shared by 16M users in multiple languages over several months, we identify peaks of suspicious account creation and suspension, and we characterize behaviours that more frequently lead to account suspension. We show how large numbers of accounts get suspended within days of their creation. Suspended accounts tend to interact mostly with legitimate users, as opposed to other suspicious accounts, often making unwarranted and excessive use of reply and mention features, and predominantly sharing spam and harmful content. While we are only able to speculate about the specific causes leading to a given account suspension, our findings shed light on patterns of platform abuse and subsequent moderation during major events.