Non-intrusive load monitoring (NILM) is essential for understanding customers' power consumption patterns and has wide potential applications such as carbon emission reduction and energy conservation. Training NILM models requires massive load data covering different types of appliances. However, local data owners may face inadequate load data and the risk of breaching power consumers' privacy during NILM model training. To mitigate these risks, this paper proposes a novel NILM method named Fed-NILM, based on Federated Learning (FL). In Fed-NILM, local model parameters, rather than local load data, are shared among multiple data owners, and the global model is obtained by weighted averaging of those parameters. Experiments on two measured load datasets are conducted to explore the generalization ability of Fed-NILM, and Fed-NILM is compared with locally trained NILM models and a centrally trained NILM model. The experimental results show that Fed-NILM has superior scalability and convergence: it outperforms the locally trained models operated by individual data owners and approximates the centrally trained model, which is trained on the entire load dataset without privacy protection. The proposed Fed-NILM thus significantly improves the co-modeling capability of local data owners while protecting power consumers' privacy.
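The weighted averaging of local model parameters described above can be sketched as follows. This is a minimal illustration of FedAvg-style aggregation, not the paper's implementation; the names (`client_params`, `client_sizes`, `federated_average`) are assumptions introduced for the example, and each client's weight is taken to be its share of the total training data.

```python
import numpy as np

def federated_average(client_params, client_sizes):
    """Aggregate local parameter vectors into a global model by
    weighting each client's parameters by its share of the total data.
    (Illustrative sketch; the paper's actual aggregation may differ.)"""
    total = sum(client_sizes)
    global_params = np.zeros_like(client_params[0], dtype=float)
    for params, n in zip(client_params, client_sizes):
        global_params += (n / total) * np.asarray(params, dtype=float)
    return global_params

# Example: three local data owners with different dataset sizes
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [100, 200, 100]
print(federated_average(clients, sizes))  # prints [3. 4.]
```

In a full FL round, each data owner would train locally on its own load data, send only the resulting parameters to the server for this averaging step, and receive the updated global model back, so raw consumer data never leaves the local site.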