By filtering the content that users see, social media platforms can influence users' perceptions and decisions, from their dining choices to their voting preferences. This influence has drawn scrutiny, with many calling for regulations on filtering algorithms, but designing and enforcing such regulations remains challenging. In this work, we examine three questions. First, given a regulation, how would one design an audit to enforce it? Second, does the audit impose a performance cost on the platform? Third, how does the audit affect the content that the platform is incentivized to filter? In response, we propose a method by which, given a regulation, an auditor can test whether the regulation is met with only black-box access to the filtering algorithm. We then turn to the platform's perspective. The platform's goal is to maximize an objective function while meeting the regulation. We find that there are conditions under which the regulation does not impose a high performance cost on the platform and, notably, that content diversity can play a key role in aligning the interests of platforms and regulators.
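To make the black-box setting concrete, the sketch below shows one form such an audit could take. Everything in it is an assumption for illustration, not the paper's method: `filter_feed` stands in for the platform's filtering algorithm, `topic_of` for a content classifier, and the prevalence rule (each topic's share of the filtered feed must stay within a factor of its share in the candidate pool) for whatever regulation is actually being enforced.

```python
# A minimal sketch of a black-box audit. All names (filter_feed,
# topic_of, the prevalence rule, min_ratio) are hypothetical; the
# paper's actual regulation and test may differ.
import random
from collections import Counter

def audit_prevalence(filter_feed, candidate_pool, topic_of,
                     min_ratio=0.8, n_trials=200, seed=0):
    """Estimate, via repeated black-box queries, whether each topic's
    share of the filtered feed stays within min_ratio of its share in
    the candidate pool. Returns True if the (hypothetical) regulation
    appears to be met on the sampled trials."""
    rng = random.Random(seed)
    pool_counts = Counter(topic_of(item) for item in candidate_pool)
    pool_share = {t: c / len(candidate_pool) for t, c in pool_counts.items()}

    shown_counts, shown_total = Counter(), 0
    for _ in range(n_trials):
        sample = rng.sample(candidate_pool, k=min(50, len(candidate_pool)))
        shown = filter_feed(sample)          # black-box call to the platform
        shown_counts.update(topic_of(item) for item in shown)
        shown_total += len(shown)

    for topic, base in pool_share.items():
        shown_share = shown_counts[topic] / max(shown_total, 1)
        if shown_share < min_ratio * base:   # topic suppressed too heavily
            return False
    return True
```

Note that the auditor never inspects the filter's internals: it only observes input-output behavior on sampled candidate pools, which is the sense in which the access is "black-box."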