We study the feasibility of touch gesture behavioural biometrics for implicit authentication of users on a smartglass (Google Glass) by proposing a continuous authentication system using two classifiers: an SVM with an RBF kernel, and a new classifier based on Chebyshev's concentration inequality. Based on data collected from 30 volunteers, we show that such authentication is feasible both in terms of classification accuracy and computational load on smartglasses. We achieve a classification accuracy of up to 99% with only 75 training samples, using behavioural biometric data from four different types of touch gestures. To show that our system generalizes, we also tested its performance on touch data from smartphones and found the accuracy to be similar to that on smartglasses. Finally, our experiments on the permanence of gestures show that the negative impact on classification accuracy of user behaviour changing over time can best be alleviated by periodically replacing older training samples with new, randomly chosen samples.
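To illustrate the second classifier, the following is a minimal sketch of a Chebyshev-bound scorer: per-feature means and standard deviations are estimated from the legitimate user's training gestures, and a test gesture is scored with the bound P(|X − μ| ≥ kσ) ≤ 1/k². The class name ChebyshevClassifier, the feature set, the averaging of per-feature bounds, and the acceptance threshold are illustrative assumptions, not the paper's exact design; the RBF-kernel SVM shown alongside it simply uses scikit-learn.

```python
import numpy as np
from sklearn.svm import SVC

class ChebyshevClassifier:
    """Scores a gesture against the owner's profile using Chebyshev's
    bound P(|X - mu| >= k*sigma) <= 1/k^2 applied per feature (sketch)."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold  # acceptance threshold (assumed value)

    def fit(self, X):
        # X: (n_samples, n_features) gesture feature vectors from the owner
        self.mu_ = X.mean(axis=0)
        self.sigma_ = X.std(axis=0) + 1e-9  # guard against zero variance

    def score(self, x):
        # Per-feature deviation in units of sigma, then the Chebyshev bound
        k = np.abs(x - self.mu_) / self.sigma_
        bounds = np.minimum(1.0, 1.0 / np.maximum(k, 1e-9) ** 2)
        return bounds.mean()  # higher = more consistent with the owner

    def predict(self, x):
        return self.score(x) >= self.threshold  # True = accept as owner

# Toy usage with synthetic features standing in for real gesture features
# (e.g. swipe duration, length, velocity, pressure):
rng = np.random.default_rng(0)
owner = rng.normal(0.0, 1.0, size=(75, 8))     # 75 owner training gestures
impostor = rng.normal(2.0, 1.0, size=(1, 8))   # one impostor gesture

cheb = ChebyshevClassifier()
cheb.fit(owner)
print(cheb.predict(owner[0]), cheb.predict(impostor[0]))

# The companion two-class RBF-kernel SVM, mirroring the two-classifier setup:
X = np.vstack([owner, rng.normal(2.0, 1.0, size=(75, 8))])
y = np.array([1] * 75 + [0] * 75)
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(svm.predict(impostor))
```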