Many recent works have explored sim-to-real transferable visual model predictive control (MPC). However, these works are limited to one-shot transfer, where real-world data must be collected once to perform the sim-to-real transfer; this data collection remains a significant human effort when transferring models learned in simulation to new real-world domains. To alleviate this problem, we first propose a novel model-learning framework called the Kalman Randomized-to-Canonical Model (KRC-model), which extracts task-relevant intrinsic features and their dynamics from randomized images. We then propose Kalman Randomized-to-Canonical Model Predictive Control (KRC-MPC), a zero-shot sim-to-real transferable visual MPC that uses the KRC-model. We evaluate our method on a valve rotation task performed by a robot hand in both simulation and the real world, and on a block mating task in simulation. The experimental results show that KRC-MPC can be applied to various real domains and tasks in a zero-shot manner.
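To make the visual-MPC setting concrete, the following is a minimal, hypothetical sketch (not the paper's actual KRC-MPC) of model predictive control by random shooting over a learned linear latent dynamics model z' = A z + B u, of the kind a Kalman-filter-style latent model might provide. The matrices `A`, `B`, the goal state, and all hyperparameters here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: MPC via random shooting over an assumed
# linear latent dynamics model z' = A z + B u.
rng = np.random.default_rng(0)

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # assumed latent transition matrix
B = np.array([[0.0], [0.1]])             # assumed control-input matrix
z_goal = np.array([1.0, 0.0])            # assumed target latent state

def rollout_cost(z0, actions):
    """Accumulate squared distance to the goal along a simulated rollout."""
    z, cost = z0.copy(), 0.0
    for u in actions:
        z = A @ z + B @ u
        cost += float(np.sum((z - z_goal) ** 2))
    return cost

def mpc_action(z0, horizon=10, n_samples=256):
    """Sample random action sequences; return the first action of the cheapest."""
    best_cost, best_u0 = np.inf, None
    for _ in range(n_samples):
        actions = rng.uniform(-1.0, 1.0, size=(horizon, 1))
        c = rollout_cost(z0, actions)
        if c < best_cost:
            best_cost, best_u0 = c, actions[0]
    return best_u0

z = np.array([0.0, 0.0])
u = mpc_action(z)   # first action of the best sampled sequence
```

In a visual-MPC pipeline such as the one the abstract describes, `z` would instead be a latent state inferred from (randomized) images, and the dynamics would be learned rather than given.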