Social media platforms have implemented automated content moderation tools to preserve community norms and mitigate online hate and harassment. Recently, these platforms have started to offer Personalized Content Moderation (PCM), granting users control over moderation settings or aligning algorithms with individual user preferences. While PCM addresses the limitations of the one-size-fits-all approach and enhances user experiences, it may also affect emergent outcomes on social media platforms. Our study reveals that PCM leads to asymmetric information loss (AIL), potentially impeding the development of a shared understanding among users, which is crucial for healthy community dynamics. We further demonstrate that PCM tools could foster the creation of echo chambers and filter bubbles, resulting in increased community polarization. Our research is the first to identify AIL as a consequence of PCM and to highlight its potential negative impacts on online communities.