This paper considers the problem of designating navigation goal locations for interactive mobile robots. We propose a point-and-click interface implemented with an Augmented Reality (AR) headset. The cameras on the AR headset are used to detect natural pointing gestures performed by the user. The selected goal is visualized through the AR headset, allowing the user to adjust the goal location if desired. We conduct a user study in which participants set consecutive navigation goals for the robot using three different interfaces: AR Point & Click, Person Following, and Tablet (bird's-eye map view). Results show that the proposed AR Point & Click interface improved perceived accuracy and efficiency and reduced mental load compared to the baseline tablet interface, and that it performed on par with the Person Following method. These results show that AR Point & Click is a feasible interaction model for setting navigation goals.