In recent years, researchers have focused on personalized federated learning (pFL) to address the inconsistent requirements of clients caused by data heterogeneity in federated learning (FL). However, existing pFL methods typically assume that the local data distribution remains unchanged during FL training; in real heterogeneous data scenarios, changing data distributions can slow model convergence and reduce model performance. In this paper, we focus on solving the pFL problem in the setting where data flows through each client like a stream, which we call Flowing Data Heterogeneity under Restricted Storage, and we shift the training goal to the comprehensive performance of the model throughout the FL training process. Based on the idea of category decoupling, we design a local data distribution reconstruction scheme and an associated generator architecture to reduce the error of the controllably replayed data distribution, and then propose our pFL framework, pFedGRP, to achieve knowledge transfer and personalized aggregation. Comprehensive experiments on five datasets under multiple settings show the superiority of pFedGRP over eight baseline methods.