We present a system for accurately predicting stable orientations for diverse rigid objects. We propose to overcome the critical issue of modelling multimodality in the space of rotations by using a conditional generative model to accurately classify contact surfaces. Our system can operate on noisy and partially observed point cloud observations captured by real-world depth cameras. Our method substantially outperforms current state-of-the-art systems on a simulated stacking task requiring highly accurate rotations, and demonstrates strong zero-shot sim2real transfer across a variety of unseen objects on a real-world reorientation task. Project website: \url{https://richardrl.github.io/stable-reorientation/}