Dual use, the intentional, harmful reuse of technology and scientific artefacts, is a problem yet to be well-defined within the context of Natural Language Processing (NLP). However, as NLP technologies continue to advance and become more widespread in society, their inner workings have grown increasingly opaque. Understanding dual-use concerns and potential ways of limiting them is therefore critical to minimising the potential harms of research and development. In this paper, we conduct a survey of NLP researchers and practitioners to understand the depth of the problem and their perspectives on it, as well as to assess the support currently available. Based on the results of our survey, we offer a definition of dual use that is tailored to the needs of the NLP community. The survey revealed that a majority of researchers are concerned about the potential dual use of their research but take only limited action to address it. In light of the survey results, we discuss the current state of, and potential means for, mitigating dual use in NLP, and propose a checklist that can be integrated into existing conference ethics frameworks, e.g., the ACL ethics checklist.