Analyzing usability test videos is arduous. Although recent research has shown the promise of AI in assisting with such tasks, it remains largely unknown how AI should be designed to facilitate effective collaboration between user experience (UX) evaluators and AI. Inspired by the concepts of agency and work context in the human-AI collaboration literature, we studied two corresponding design factors for AI-assisted UX evaluation: explanations and synchronization. Explanations allow the AI to further inform humans how it identifies UX problems from a usability test session; synchronization refers to the two ways humans and AI can collaborate: synchronously and asynchronously. We iteratively designed a tool, AI Assistant, with four versions of its UI corresponding to the two levels of explanations (with/without) and synchronization (sync/async). Adopting a hybrid Wizard-of-Oz approach to simulate an AI with reasonable performance, we conducted a mixed-methods study in which 24 UX evaluators identified UX problems from usability test videos using AI Assistant. Our quantitative and qualitative results show that AI with explanations, whether presented synchronously or asynchronously, provided better support for UX evaluators' analysis and was perceived more positively; without explanations, synchronous AI improved UX evaluators' performance and engagement more than asynchronous AI did. Lastly, we present design implications for AI-assisted UX evaluation and for facilitating more effective human-AI collaboration.