The Computers Are Social Actors paradigm suggests that people exhibit social and anthropomorphic biases in their treatment of technology. Such insights have encouraged the design of interfaces that interact with users in more social (chatty or even friend-like) ways. However, in typical `dark pattern' fashion, social-emotional responses to systems as (seemingly sentient) agents can be harnessed to manipulate user behaviour. An increasingly common example is app notifications that adopt person-like tones to persuade or pressure users into compliance. Manipulative or not, failures to meet contextual social expectations can make automated social acting seem rude, invasive, tactless, and even disrespectful, constituting `social' anti-patterns. This paper explores ways to improve how automated systems treat people in interactions. We combined four qualitative methods to elicit user experiences and preferences regarding how automated systems ``talk'' to, and at, them. We identify an emerging `social' class of dark and anti-patterns, and propose guidelines for helping (social) interfaces treat users in more respectful, tactful, and autonomy-supportive ways.