Federated Learning (FL) has opened the opportunity for collaboratively training machine learning models on heterogeneous mobile or edge devices while keeping local data private. With an increase in its adoption, a growing concern relates to its economic and environmental cost (as is also the case for other machine learning techniques). Unfortunately, little work has been done to optimize its energy consumption or its emissions of carbon dioxide and equivalents, as energy minimization is usually left as a secondary objective. In this paper, we investigate the problem of minimizing the energy consumption of FL training on heterogeneous devices by controlling the workload distribution. We model this as the Minimal Cost FL Schedule problem, a total cost minimization problem with identical, independent, and atomic tasks that have to be assigned to heterogeneous resources with arbitrary cost functions. We propose a pseudo-polynomial optimal solution to this problem based on the previously unexplored Multiple-Choice Minimum-Cost Maximal Knapsack Packing Problem. We also provide four algorithms for scenarios where cost functions are monotonically increasing and follow the same behavior. These solutions are likewise applicable to the minimization of other kinds of costs, and to other one-dimensional data partition problems.
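To make the scheduling problem concrete, the following is an illustrative sketch (not the paper's algorithm): a standard pseudo-polynomial dynamic program that assigns T identical tasks to n heterogeneous devices, where `costs[i][k]` is an arbitrary cost for device i to execute k tasks, and the total cost is minimized. All names here are hypothetical.

```python
import math

def min_cost_schedule(costs, T):
    """Assign T identical tasks to heterogeneous devices.

    costs[i][k] = cost of device i executing k tasks, for k = 0..T
    (arbitrary, not necessarily monotonic).
    Returns (minimum total cost, per-device task counts).
    Runs in O(n * T^2) time -- pseudo-polynomial in T.
    """
    n = len(costs)
    INF = math.inf
    # dp[t] = minimal cost of placing t tasks on the devices seen so far
    dp = [0.0] + [INF] * T
    choice = [[0] * (T + 1) for _ in range(n)]
    for i in range(n):
        new = [INF] * (T + 1)
        for t in range(T + 1):
            for k in range(t + 1):
                c = dp[t - k] + costs[i][k]
                if c < new[t]:
                    new[t] = c
                    choice[i][t] = k
        dp = new
    # Recover the assignment by walking the recorded choices backwards.
    assign, t = [0] * n, T
    for i in range(n - 1, -1, -1):
        assign[i] = choice[i][t]
        t -= assign[i]
    return dp[T], assign
```

For instance, with two devices whose costs for k tasks are k² and 2k respectively, scheduling 4 tasks yields a total cost of 7 by giving one task to the first device and three to the second. Solutions tailored to monotonically increasing cost functions, as in the paper, can avoid the full O(n·T²) search.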