TinyML has risen to popularity in an era where data is everywhere. However, the data that is in most demand is subject to strict privacy and security requirements. In addition, the deployment of TinyML hardware in the real world faces significant memory and communication constraints that traditional ML fails to address. In light of these challenges, we present TinyFedTL, the first implementation of federated transfer learning on a resource-constrained microcontroller.