Shared micromobility systems, such as electric scooters and bikes, have gained widespread popularity as sustainable alternatives to traditional transportation modes. However, these systems face persistent challenges due to spatio-temporal demand fluctuations, which often result in a mismatch between vehicle supply and user demand. Existing shared micromobility vehicle scheduling methods typically redistribute vehicles only once or twice per day, which makes them vulnerable to performance degradation under atypical conditions. In this work, we propose to augment existing micromobility scheduling methods by integrating a small number of autonomous shared micromobility vehicles (ASMVs), which possess self-rebalancing capabilities and can dynamically adapt to real-time demand. Specifically, we introduce SMART, a hierarchical reinforcement learning framework that jointly optimizes high-level initial deployment and low-level real-time rebalancing for ASMVs. We evaluate our framework on real-world e-scooter usage data from Chicago. Our experimental results show that the framework is highly effective and generalizes well, allowing it to integrate seamlessly with existing vehicle scheduling methods and significantly enhance overall micromobility service performance.
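To make the two-level structure concrete, the sketch below illustrates one plausible reading of the hierarchy: a high-level policy that places ASMVs at the start of the day from a demand forecast, and a low-level policy that moves them toward underserved zones as demand arrives. All names (DeploymentPolicy, RebalancingPolicy, the zone counts, and the greedy heuristics) are illustrative assumptions, not the paper's actual SMART implementation, which learns both levels with reinforcement learning.

```python
# Illustrative sketch only: the class names, zone layout, and greedy rules below are
# assumptions used to show the high-level / low-level split, not the SMART algorithm.
import random

N_ZONES = 4   # hypothetical number of city zones
N_ASMV = 3    # hypothetical small fleet of autonomous vehicles


class DeploymentPolicy:
    """High-level policy: picks initial zones for ASMVs once per day."""

    def act(self, forecast):
        # Greedy placement on the forecast with a little exploration
        # (a crude stand-in for the learned high-level policy).
        if random.random() < 0.1:
            return random.sample(range(N_ZONES), N_ASMV)
        ranked = sorted(range(N_ZONES), key=lambda z: -forecast[z])
        return ranked[:N_ASMV]


class RebalancingPolicy:
    """Low-level policy: moves ASMVs toward the currently most underserved zone."""

    def act(self, positions, realtime_demand, supply):
        gaps = [realtime_demand[z] - supply[z] for z in range(N_ZONES)]
        target = max(range(N_ZONES), key=lambda z: gaps[z])
        # Send every ASMV toward the largest demand-supply gap
        # (again, a placeholder for the learned low-level policy).
        return [target for _ in positions]


def simulate_day(deploy, rebalance, forecast, realtime_demand):
    """Roll out one day: deploy once, then rebalance at each demand snapshot."""
    positions = deploy.act(forecast)
    served = 0
    for step_demand in realtime_demand:  # e.g. hourly demand snapshots
        supply = [positions.count(z) for z in range(N_ZONES)]
        served += sum(min(step_demand[z], supply[z]) for z in range(N_ZONES))
        positions = rebalance.act(positions, step_demand, supply)
    return served


if __name__ == "__main__":
    random.seed(0)
    forecast = [5, 1, 3, 2]                  # hypothetical daily forecast per zone
    realtime = [[2, 0, 1, 1], [1, 2, 0, 3]]  # hypothetical per-step demand
    print(simulate_day(DeploymentPolicy(), RebalancingPolicy(), forecast, realtime))
```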