Most mobile health apps employ data visualization to help people view their health and activity data, but these apps provide limited support for visual data exploration. Furthermore, despite its huge potential benefits, mobile visualization research in the personal data context is sparse. This work aims to empower people to easily navigate and compare their personal health data on smartphones by enabling flexible time manipulation with speech. We designed and developed Data@Hand, a mobile app that leverages the synergy of two complementary modalities: speech and touch. Through an exploratory study with 13 long-term Fitbit users, we examined how multimodal interaction helps participants explore their own health data. Participants successfully adopted multimodal interaction (i.e., speech and touch) for convenient and fluid data exploration. Based on the quantitative and qualitative findings, we discuss design implications and opportunities with multimodal interaction for better supporting visual data exploration on mobile devices.