Advances in low-power electronics and machine learning techniques have led to many novel wearable IoT devices. These devices have limited battery capacity and computational power. Thus, energy harvesting from ambient sources is a promising solution to power these low-energy wearable devices. They need to manage the harvested energy optimally to achieve energy-neutral operation, which eliminates recharging requirements. Optimal energy management is a challenging task due to the dynamic nature of the harvested energy and the battery energy constraints of the target device. To address this challenge, we present a reinforcement learning-based energy management framework, tinyMAN, for resource-constrained wearable IoT devices. The framework maximizes the utilization of the target device under dynamic energy harvesting patterns and battery constraints. Moreover, tinyMAN does not rely on forecasts of the harvested energy, which makes it a prediction-free approach. We deployed tinyMAN on a wearable device prototype using TensorFlow Lite for Microcontrollers thanks to its small memory footprint of less than 100 KB. Our evaluations show that tinyMAN takes less than 2.36 ms and consumes less than 27.75 $\mu$J per decision while maintaining up to 45% higher utility compared to prior approaches.
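To make the energy-neutral operation goal concrete, the following is a minimal illustrative sketch, not the paper's reinforcement learning policy: a simple rule-based manager that decides how much energy the device may spend each time step so that long-run consumption tracks long-run harvest while the battery stays within its limits. All function names, parameters, and numeric values here are hypothetical.

```python
def allocate_energy(battery_J, harvested_J, capacity_J, reserve_J=10.0):
    """Return (budget_J, new_battery_J) for one time step.

    Spends the incoming harvested energy plus a small fraction of any
    battery surplus above a safety reserve, so consumption roughly
    matches harvest over time (energy-neutral operation). A learned
    policy such as tinyMAN's would replace this hand-tuned rule.
    """
    # Charge the battery with the harvested energy, capped at capacity.
    battery_J = min(capacity_J, battery_J + harvested_J)
    # Energy available beyond the safety reserve.
    surplus = max(0.0, battery_J - reserve_J)
    # Spend the harvest plus 10% of the surplus, never more than stored.
    budget = min(battery_J, harvested_J + 0.1 * surplus)
    return budget, battery_J - budget

# Example step: half-full 100 J battery, 5 J harvested this interval.
budget, battery = allocate_energy(battery_J=50.0, harvested_J=5.0,
                                  capacity_J=100.0)
```

In this sketch the 10% surplus factor trades off immediate device utility against keeping a buffer for periods of low harvest; the reinforcement learning agent in the paper learns this trade-off from dynamic harvesting patterns instead of fixing it by hand.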