Federated learning is a distributed machine learning paradigm in which clients complete collaborative training without sharing private data, exchanging only parameters. However, client data do not follow the same distribution and client computing resources differ, which poses challenges for related research. To better address these heterogeneity problems, we design a novel federated learning method. The local model consists of a pre-trained model as the backbone and fully connected layers as the head: the backbone extracts features for the head, and class embedding vectors are shared among clients to optimize the head so that each local model performs better. By sharing class embedding vectors instead of gradient-space parameters, clients can better adapt to their private data, and communication between the server and clients is more efficient. To better protect privacy, we propose a hybrid privacy-preserving method that adds noise to the class embedding vectors, which satisfies differential privacy while having little impact on local model performance. We conduct a comprehensive evaluation against other federated learning methods on a self-built vehicle dataset under non-independent and identically distributed (Non-IID) settings.
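The abstract does not specify how the class embedding vectors are formed or how noise is calibrated; as a minimal illustrative sketch (not the paper's actual hybrid method), the snippet below computes per-class embeddings by averaging backbone features and then applies a standard Gaussian mechanism: clip each embedding's L2 norm, then add noise scaled to the clipping bound. All function names, the clipping bound `clip_norm`, and the noise multiplier `sigma` are hypothetical.

```python
import numpy as np

def build_class_embeddings(features, labels, num_classes):
    """Average backbone features per class to form one embedding vector per class.
    Hypothetical helper: the paper does not specify its aggregation rule."""
    dim = features.shape[1]
    emb = np.zeros((num_classes, dim))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            emb[c] = features[mask].mean(axis=0)
    return emb

def privatize_embeddings(embeddings, clip_norm=1.0, sigma=0.5, rng=None):
    """Gaussian mechanism sketch: clip each class embedding to L2 norm
    <= clip_norm, then add Gaussian noise with std sigma * clip_norm."""
    rng = rng if rng is not None else np.random.default_rng(0)
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    clipped = embeddings * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    noise = rng.normal(0.0, sigma * clip_norm, size=embeddings.shape)
    return clipped + noise
```

In a round of the protocol described above, each client would send only these (noised) per-class vectors to the server instead of full gradient updates, which is what makes the exchange cheaper than parameter-based communication.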