With the 2022 US midterm elections approaching, conspiratorial claims about the 2020 presidential election continue to threaten users' trust in the electoral process. To regulate election misinformation, YouTube introduced policies to remove such content from its search results and recommendations. In this paper, we conduct a 9-day crowd-sourced audit of YouTube to assess the extent to which these policies are enforced. We recruited 99 users who installed a browser extension that enabled us to collect up-next recommendation trails and search results for 45 videos and 88 search queries about the 2020 elections. We find that YouTube's search results, irrespective of search query bias, contain more videos that oppose rather than support election misinformation. However, watching misinformative election videos still leads users to a small number of misinformative videos in the up-next trails. Our results imply that while YouTube appears largely successful in regulating election misinformation, there is still room for improvement.