Content moderation practices and technologies need to change over time as requirements and community expectations shift. However, restructuring existing moderation practices can be difficult, especially for platforms that rely on their communities to conduct moderation, because changes can transform moderators' workflows and workloads as well as contributors' reward systems. Through a study of extensive archival discussions around Flagged Revisions, a prepublication moderation technology on Wikipedia, complemented by seven semi-structured interviews, we identify various challenges in restructuring community-based moderation practices. We find that while a new system may sound good in theory and perform well on quantitative metrics, it can conflict with existing social norms. Our findings also highlight how the intricate relationship between platforms and self-governed communities can hinder the assessment of any new system's performance and introduce considerable costs for maintaining, overhauling, or scrapping any piece of infrastructure.