Adaptive control for real-time manipulation requires fast estimation and prediction of object properties. While robot learning in this area primarily focuses on vision, many tasks cannot rely on vision due to object occlusion. Here, we formulate a learning framework that fuses tactile and audio data to quickly characterize and predict an object's properties. The predictions feed a reactive controller we developed, which adapts the grip on the object to compensate for the inertial forces predicted to arise during motion. Drawing inspiration from how humans interact with objects, we propose an experimental setup for understanding how to best utilize different sensory signals and how to actively interact with and manipulate objects so as to quickly learn their properties for safe manipulation.
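To make the grip-adaptation idea concrete, here is a minimal sketch (not the paper's implementation) of a reactive grip rule: given an estimated mass and a predicted acceleration, it computes the normal grip force needed to keep the object from slipping under a simple two-finger friction-cone model. The friction coefficient, safety margin, and function name are illustrative assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def required_grip_force(mass_est, accel_pred, mu=0.5, margin=1.2):
    """Normal force per the friction-cone limit, with a safety margin.

    mass_est   : estimated object mass (kg), e.g. from sensory fusion
    accel_pred : predicted end-effector acceleration magnitude (m/s^2)
    mu         : assumed contact friction coefficient (illustrative)
    margin     : safety factor on top of the friction-limit minimum
    """
    # Total load the fingers must support: gravity plus inertial force.
    load = mass_est * math.hypot(G, accel_pred)
    # Two opposing contacts, each contributing mu * F_normal of friction.
    return margin * load / (2.0 * mu)


# Example: 0.3 kg object with a predicted 4 m/s^2 lateral acceleration.
f_grip = required_grip_force(0.3, 4.0)
```

As the predicted inertial load grows mid-motion, the controller would raise `f_grip` accordingly; better mass and friction estimates (here, from tactile and audio fusion) allow a smaller safety margin and a gentler grip.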