Client-Side Scanning (CSS), as exemplified by Child Sexual Abuse Material Detection (CSAMD), represents ubiquitous mass scanning. Apple proposed to scan its systems for such imagery. CSAMD has since been pushed back, but the European Union has decided to propose mandatory CSS to combat and prevent child sexual abuse, weakening encryption in the process. CSS amounts to mass surveillance of personal property, pictures and text, without regard for privacy, cybersecurity or the law. We first argue why CSS should be limited or not used at all, and discuss issues with the way pictures are cryptographically handled and whether CSAMD preserves privacy. In the second part, we analyse the possible human rights violations which CSS in general can cause under the regime of the European Convention on Human Rights. The focus is the harm the system may cause to individuals, and we also comment on the proposed Child Abuse Regulation. We find that CSS is problematic because such systems can rarely fulfil their purposes, as seen with antivirus software. The costs of attempting to solve issues such as CSAM outweigh the benefits, and this is unlikely to change. CSAMD as proposed is unlikely to preserve privacy or security in the way described in its source materials. We also find that CSS in general would likely violate the right to a fair trial, the right to privacy and freedom of expression. Pictures could be obtained in a way that renders a trial against a legitimate perpetrator inadmissible or violates their right to a fair trial; the lack of safeguards to protect privacy at the national legal level would violate the right to privacy; and it is unclear whether this kind of scanning could pass the legal test which freedom of expression requires. Finally, we find significant issues with the proposed Regulation, as it relies on techno-solutionist arguments and disregards established knowledge in cybersecurity.