Journal of Shanghai Jiao Tong University ›› 2024, Vol. 58 ›› Issue (6): 846-854. doi: 10.16183/j.cnki.jsjtu.2022.534

• New Type Power System and Integrated Energy •

Non-Intrusive Load Disaggregation Using Sequence-to-Point Integrating External Attention Mechanism

LI Lijuan1, LIU Hai1, LIU Hongliang2, ZHANG Qingsong1, CHEN Yongdong1

  1. College of Automation and Electronic Information, Xiangtan University, Xiangtan 411105, Hunan, China
    2. National Center for Applied Mathematics in Hunan, Xiangtan 411105, Hunan, China
  • Received: 2022-12-29  Revised: 2023-04-26  Accepted: 2023-05-22  Online: 2024-06-28  Published: 2024-07-05
  • Corresponding author: LIU Hongliang, professor, doctoral supervisor; E-mail: lhl@xtu.edu.cn.
  • First author: LI Lijuan (1980-), professor, doctoral supervisor; her research focuses on power system vulnerability.
  • Funding: National Natural Science Foundation of China (52077189); Open Project of the Key Laboratory of Control of Power Transmission and Conversion, Ministry of Education (2021AA02)

Abstract:

Non-intrusive load disaggregation (NILD) can deeply mine the information contained in customer power consumption data, providing an important reference for decision analysis such as power equipment fault monitoring and demand response. To resolve the conflict between training time cost and disaggregation accuracy in NILD algorithms, a sequence-to-point NILD algorithm integrating an external attention (EA) mechanism is proposed. First, the aggregate power consumption sequence is pre-processed by data cleaning, normalization, and other operations, and the training inputs are constructed with a fixed-length window; appliance features are then extracted automatically by the encoder layer. Next, an external attention mechanism is designed to enhance the weights of important features. Finally, the load disaggregation results are produced by the decoder layer. The proposed model and current mainstream models are simulated on the publicly available REDD and UK-DALE datasets and compared in terms of signal aggregate error, mean absolute error, normalized disaggregation error, disaggregation curves, feature maps, and user energy consumption. The proposed model overcomes the attention dispersion of the convolutional layers, enhances the extraction and utilization of effective information, and achieves higher disaggregation accuracy without increasing the training time cost.
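
For readers unfamiliar with the pipeline described above (convolutional encoder, external attention over the extracted features, dense decoder that regresses the appliance power at the window midpoint), the sketch below illustrates its general structure. It is a minimal illustration, not the authors' implementation: the PyTorch framing, the layer sizes, the 599-sample window, and the double-normalized external attention with two small learnable memory units are all assumptions.

# Minimal illustrative sketch (not the authors' code): layer sizes, window length,
# and the external-attention formulation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExternalAttention(nn.Module):
    """External attention: features attend to two small learnable memories shared
    across all samples (assumed formulation; memory size is illustrative)."""
    def __init__(self, d_model: int, mem_size: int = 64):
        super().__init__()
        self.mk = nn.Linear(d_model, mem_size, bias=False)   # external key memory M_k
        self.mv = nn.Linear(mem_size, d_model, bias=False)   # external value memory M_v

    def forward(self, x):                                    # x: (batch, seq_len, d_model)
        attn = F.softmax(self.mk(x), dim=1)                  # normalize over time steps
        attn = attn / (attn.sum(dim=2, keepdim=True) + 1e-9) # double normalization
        return x + self.mv(attn)                             # residual keeps encoder features

class Seq2PointEA(nn.Module):
    """Sequence-to-point regressor: a window of mains power in, the target
    appliance's power at the window midpoint out."""
    def __init__(self, window: int = 599):
        super().__init__()
        self.encoder = nn.Sequential(                        # 1-D convolutional feature extractor
            nn.Conv1d(1, 30, 10, padding="same"), nn.ReLU(),
            nn.Conv1d(30, 40, 8, padding="same"), nn.ReLU(),
            nn.Conv1d(40, 50, 6, padding="same"), nn.ReLU(),
        )
        self.attention = ExternalAttention(d_model=50)
        self.decoder = nn.Sequential(                        # dense decoder
            nn.Flatten(), nn.Linear(50 * window, 1024), nn.ReLU(), nn.Linear(1024, 1),
        )

    def forward(self, mains):                                # mains: (batch, 1, window), standardized
        h = self.encoder(mains)                              # (batch, 50, window)
        h = self.attention(h.transpose(1, 2)).transpose(1, 2)
        return self.decoder(h)                               # (batch, 1)

A forward pass on a batch of standardized mains windows, e.g. Seq2PointEA()(torch.randn(16, 1, 599)), returns one appliance-power estimate per window.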

Key words: non-intrusive load disaggregation (NILD), external attention (EA) mechanism, neural networks, sequence-to-point (Seq2Point)
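
The comparison metrics named in the abstract (signal aggregate error, mean absolute error, normalized disaggregation error) are standard NILM measures; the helpers below give one common definition of each. This is illustrative code under the usual definitions, not the paper's evaluation script, and the paper may normalize differently.

# Hypothetical metric helpers using common NILM definitions.
import numpy as np

def mae(pred, truth):
    """Mean absolute error between estimated and measured appliance power."""
    pred, truth = np.asarray(pred, float), np.asarray(truth, float)
    return np.mean(np.abs(pred - truth))

def sae(pred, truth):
    """Signal aggregate error: relative error of the total estimated energy."""
    pred, truth = np.asarray(pred, float), np.asarray(truth, float)
    return abs(pred.sum() - truth.sum()) / truth.sum()

def nde(pred, truth):
    """Normalized disaggregation error: squared error normalized by true power."""
    pred, truth = np.asarray(pred, float), np.asarray(truth, float)
    return np.sum((truth - pred) ** 2) / np.sum(truth ** 2)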
