We consider the problem of optimizing the distribution operations of a drone hub that dispatches drones to geographic locations generating stochastic demands for medical supplies. Drone delivery is an innovative method that offers many benefits, such as low-contact delivery, thereby reducing the spread of pandemic and vaccine-preventable diseases. While we focus on medical supply delivery in this work, drone delivery is suitable for many other items, including food, postal parcels, and e-commerce goods. Our goal in this paper is to address drone delivery challenges arising from the stochastic demands of different geographic locations. We consider demand classes tied to geographic locations that require different flight ranges, which in turn are directly related to the amount of charge held in a drone's battery. We classify the stochastic demands by their distance from the drone hub, model the problem as a Markov decision process, and perform computational tests using realistic data representing a prominent drone delivery company. We solve the problem with a reinforcement learning method and show that it performs well compared with the exact solution obtained via dynamic programming. Finally, we analyze the results and provide insights for managing drone hub operations.
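The modeling approach described above (battery charge as the state, demand classes defined by distance from the hub, and a reinforcement learning method in place of exact dynamic programming) can be illustrated with a minimal toy sketch. Everything below is hypothetical for illustration only: the charge levels, per-class charge consumption, demand probability, and rewards are assumptions, not the paper's actual model or data, and tabular Q-learning stands in generically for whatever reinforcement learning method the paper uses.

```python
import random

# Hypothetical toy model: one drone whose battery charge is the state.
# Demand classes are defined by distance from the hub; farther classes
# consume more charge per dispatch (all numbers are assumptions).
CHARGE_LEVELS = 5                                # battery states 0..4
CLASSES = {"near": 1, "mid": 2, "far": 3}        # charge consumed per class
REWARD = {"near": 1.0, "mid": 2.0, "far": 3.0}   # assumed reward per delivery
ACTIONS = list(CLASSES) + ["recharge"]

def step(charge, action, rng):
    """Transition: serve a demand class if charge suffices, else recharge."""
    if action == "recharge":
        return min(charge + 1, CHARGE_LEVELS - 1), 0.0
    need = CLASSES[action]
    if charge < need:
        return charge, -1.0                      # infeasible dispatch penalized
    if rng.random() < 0.7:                       # stochastic demand (assumed)
        return charge - need, REWARD[action]
    return charge, 0.0                           # dispatched, no demand served

def q_learning(episodes=5000, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in range(CHARGE_LEVELS) for a in ACTIONS}
    for _ in range(episodes):
        s = CHARGE_LEVELS - 1                    # start each episode fully charged
        for _ in range(20):                      # finite horizon per episode
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q[(s, x)])
            s2, r = step(s, a, rng)
            best_next = max(Q[(s2, x)] for x in ACTIONS)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2
    return Q

Q = q_learning()
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(CHARGE_LEVELS)}
```

In this toy setting the learned policy recharges at low charge (dispatching with insufficient charge is penalized) and trades off near, mid, and far dispatches at higher charge, mirroring the charge-versus-flight-range tension the abstract describes.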