TikTok has seen exponential growth as a platform, fuelled by the success of its proprietary recommender algorithm, which serves tailored content to every user, though not without controversy. Users complain of their content being unfairly suppressed by "the algorithm", particularly users with marginalised identities such as LGBTQ+ users. Together with content removal, this suppression acts to censor what is shared on the platform. Journalists have revealed biases in automatic censorship, as well as in human moderation. We investigate experiences of censorship on TikTok across users marginalised by their gender, LGBTQ+ identity, disability or ethnicity. We survey 627 UK-based TikTok users and find that marginalised users often feel they are subject to censorship for content that does not violate community guidelines. We highlight many avenues for future research into censorship on TikTok, with a focus on users' folk theories, which greatly shape their experiences of the platform.