Federated Learning is a widely adopted method for training neural networks over distributed data. One main limitation is the performance degradation that occurs when data is heterogeneously distributed. While many works have attempted to address this problem, these methods underperform because they rest on a limited understanding of how neural networks learn. In this work, we verify that only certain important layers in a neural network require regularization for effective training. We additionally verify that Centered Kernel Alignment (CKA) most accurately measures the similarity between layers of neural networks trained on different data. By applying CKA-based regularization to important layers during training, we significantly improve performance in heterogeneous settings. We present FedCKA: a simple framework that outperforms previous state-of-the-art methods on various deep learning tasks while also improving efficiency and scalability.
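For concreteness, the sketch below shows the standard linear-CKA similarity between the activations of two corresponding layers, together with a hypothetical CKA-regularized local objective of the kind the abstract describes. The specific CKA variant, the regularization weight `mu`, and the choice of which layers count as important are assumptions for illustration, not details taken from the abstract.

```python
import torch

def linear_cka(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Linear CKA between two activation matrices of shape (n_examples, n_features).

    Column-centering the features makes the Frobenius-norm form below
    equivalent to the HSIC-based definition of CKA.
    """
    x = x - x.mean(dim=0, keepdim=True)  # center each feature column
    y = y - y.mean(dim=0, keepdim=True)
    # CKA(X, Y) = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    cross = torch.linalg.norm(y.T @ x) ** 2
    return cross / (torch.linalg.norm(x.T @ x) * torch.linalg.norm(y.T @ y))

def cka_regularized_loss(task_loss: torch.Tensor,
                         local_acts: torch.Tensor,
                         global_acts: torch.Tensor,
                         mu: float = 1.0) -> torch.Tensor:
    # Hypothetical local objective: penalize dissimilarity between the
    # local model's activations and the global model's activations at an
    # "important" layer. `mu` is an assumed regularization weight.
    return task_loss + mu * (1.0 - linear_cka(local_acts, global_acts))
```

In a sketch like this, each client would compute `local_acts` and `global_acts` by forwarding the same minibatch through its local model and the received global model, applying the penalty only at the selected important layers rather than across the whole network.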