Today's online platforms rely heavily on recommendation systems to serve content to their users; social media is a prime example. In turn, recommendation systems largely depend on artificial intelligence algorithms to decide who gets to see what. While the content social media platforms deliver is as varied as the users who engage with them, it has been shown that platforms can cause serious harm to individuals, groups, and societies. Studies suggest that these negative impacts range from worsening an individual's mental health to driving society-wide polarisation capable of putting democracies at risk. To better safeguard people from these harms, the European Union's Digital Services Act (DSA) requires platforms, especially those with large numbers of users, to make their algorithmic systems more transparent and to follow due diligence obligations. These requirements constitute an important legislative step towards mitigating the systemic risks posed by online platforms. However, the DSA lacks concrete guidelines to operationalise a viable audit process that would allow auditors to hold these platforms accountable. This void could foster the spread of 'audit-washing', that is, platforms exploiting audits to legitimise their practices and evade responsibility. To fill this gap, we propose a risk-scenario-based audit process. We explain in detail what audits and assessments of recommender systems should look like under the DSA. Our approach also accounts for the evolving nature of platforms and emphasises the observability of their recommender systems' components. The resulting audit facilitates both internal comparability (among audits of the same system at different points in time) and external comparability (among audits of different platforms), while also enabling the evaluation of mitigation measures implemented by the platforms themselves.