The Computers Are Social Actors (CASA) paradigm suggests that people exhibit social/anthropomorphic biases in their treatment of technology. Such insights have encouraged interaction designers to make automated systems act in more social (chatty or even friend-like) ways. However, as with typical dark patterns, people's social-emotional responses to systems as (seemingly sentient) agents can be harnessed to manipulate user behaviour. An increasingly common example is app notifications that adopt person-like tones to persuade or pressure users into compliance. Even without manipulative intent, failures to meet contextual social expectations can make automated social acting seem rude, invasive, tactless, and even disrespectful -- constituting social `anti-patterns'. This paper explores ways to improve how automated systems treat people in interactions. We combined four qualitative methods to elicit user experiences and preferences regarding how interfaces ``talk'' to/at them. We identify an emerging `social' class of dark and anti-patterns, and propose guidelines to help (`social') interfaces treat users in more respectful, tactful, and autonomy-supportive ways.