This paper presents BURG-Toolkit, a set of open-source tools for Benchmarking and Understanding Robotic Grasping. Our tools allow researchers to: (1) create virtual scenes for generating training data and performing grasping in simulation; (2) recreate the scene by arranging the corresponding objects accurately in the physical world for real robot experiments, supporting an analysis of the sim-to-real gap; and (3) share the scenes with other researchers to foster comparability and reproducibility of experimental results. We explain how to use our tools by describing some potential use cases. We further provide proof-of-concept experimental results quantifying the sim-to-real gap for robot grasping in some example scenes. The tools are available at: https://mrudorfer.github.io/burg-toolkit/