To make off-screen interaction practical without specialized hardware, we investigate deep learning methods that process the built-in IMU sensors (accelerometer and gyroscope) common on mobile phones into a useful set of one-handed interaction events. We present the design, training, implementation, and applications of TapNet, a multi-task network that detects tapping on a smartphone. Using the phone's form factor as auxiliary information, TapNet can jointly learn from data across devices and simultaneously recognize multiple tap properties, including tap direction and tap location. We developed two datasets comprising over 135K training samples, 38K testing samples, and 32 participants in total. Experimental evaluation demonstrates the effectiveness of the TapNet design and its significant improvement over the state of the art. Together with the datasets (https://sites.google.com/site/michaelxlhuang/datasets/tapnet-dataset) and extensive experiments, TapNet establishes a new technical foundation for off-screen mobile input.
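The multi-task idea described above — a shared representation of IMU features plus a form-factor auxiliary input, feeding separate heads for tap direction and tap location — can be illustrated with a minimal numpy sketch. All names, dimensions, and the concatenation scheme here are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
IMU_FEATS = 128      # shared features extracted from accelerometer + gyroscope signals
FORM_DIM = 8         # embedding of the phone form factor (auxiliary input)
N_DIRECTIONS = 4     # hypothetical tap directions, e.g. front / back / left / right
N_LOCATIONS = 9      # hypothetical tap locations, e.g. a 3x3 grid on the device

# Randomly initialized weights stand in for a trained shared trunk and two task heads
W_dir = rng.normal(size=(IMU_FEATS + FORM_DIM, N_DIRECTIONS))
W_loc = rng.normal(size=(IMU_FEATS + FORM_DIM, N_LOCATIONS))

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def predict(imu_features, form_embedding):
    """Jointly predict tap direction and tap location from a shared input.

    The form-factor embedding is concatenated with the IMU features so a
    single model can be trained on data pooled across different devices.
    """
    shared = np.concatenate([imu_features, form_embedding], axis=-1)
    return softmax(shared @ W_dir), softmax(shared @ W_loc)

# One synthetic IMU sample and form-factor embedding
p_dir, p_loc = predict(rng.normal(size=IMU_FEATS), rng.normal(size=FORM_DIM))
print(p_dir.shape, p_loc.shape)  # (4,) (9,)
```

The key design point is that both heads read the same shared vector, so gradient updates for one task also shape the representation used by the other — the usual motivation for multi-task learning.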