Safety assurance is critical in the planning and control of robotic systems. For robots operating in the real world, safety-critical designs often need to address uncertainty explicitly, and the pre-computed guarantees typically rely on assumptions about the particular distribution of that uncertainty. However, the actual uncertainty distribution is difficult to characterize beforehand, so an established safety guarantee may be violated when the assumed and actual distributions do not match. In this paper, we propose a novel safe control framework that provides a high-probability safety guarantee for stochastic dynamical systems whose motion noise follows an unknown distribution. Specifically, the framework adopts adaptive conformal prediction to dynamically quantify the prediction uncertainty from online observations and combines it with a probabilistic extension of control barrier functions (CBFs) to characterize uncertainty-aware control constraints. By integrating these constraints into a model predictive control scheme, the framework allows robots to adaptively capture the true prediction uncertainty online in a distribution-free setting while enjoying formally provable high-probability safety assurance. Simulation results on multi-robot systems with stochastic single-integrator and unicycle dynamics demonstrate the effectiveness of our framework.
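To make the adaptive conformal prediction step concrete, the sketch below shows the standard online miscoverage update and the quantile computation it feeds. This is a minimal illustration, not the authors' implementation: the learning rate `gamma`, the target level `alpha_target`, and the use of prediction-error norms as nonconformity scores are assumptions; in the framework described above, the resulting quantile would serve as the uncertainty-aware margin entering the probabilistic CBF constraints of the MPC.

```python
import numpy as np

def conformal_quantile(scores, alpha_t):
    """Empirical (1 - alpha_t)-quantile of past nonconformity scores (prediction-error norms)."""
    if len(scores) == 0 or alpha_t >= 1.0:
        return 0.0
    if alpha_t <= 0.0:
        return float(np.max(scores))  # most conservative margin
    return float(np.quantile(scores, 1.0 - alpha_t))

def adaptive_conformal_update(alpha_t, covered, alpha_target=0.05, gamma=0.05):
    """One online update of the miscoverage level: alpha_{t+1} = alpha_t + gamma * (alpha_target - err_t)."""
    err_t = 0.0 if covered else 1.0
    return alpha_t + gamma * (alpha_target - err_t)

# Toy usage with synthetic motion-prediction errors: the quantile q_t acts as the
# uncertainty margin added to the safety constraint and adapts as new errors arrive.
rng = np.random.default_rng(0)
scores, alpha_t = [], 0.05
for t in range(200):
    q_t = conformal_quantile(np.array(scores), alpha_t)  # margin used at step t
    new_error = abs(rng.normal(scale=0.1))               # ||predicted - observed|| (synthetic)
    covered = new_error <= q_t
    alpha_t = adaptive_conformal_update(alpha_t, covered)
    scores.append(new_error)
```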