Personal moderation tools on social media platforms allow users to control their feeds by setting acceptable toxicity thresholds for the content they see or by muting inappropriate accounts. This research examines how end-user configuration of these tools is shaped by four psychosocial factors: fear of missing out (FoMO), social media addiction, subjective norms, and trust in moderation systems. Findings from a nationally representative sample of 1,061 participants show that FoMO and social media addiction make Facebook users more vulnerable to content-based harms by reducing their likelihood of adopting personal moderation tools to hide inappropriate posts. In contrast, descriptive and injunctive norms positively influence the use of these tools. Further, trust in Facebook's moderation systems significantly affects users' engagement with personal moderation. This analysis highlights qualitatively different pathways through which FoMO and social media addiction leave affected users disproportionately unsafe, and it offers design and policy solutions to address this challenge.