LDformer: a parallel neural network model for long-term power forecasting

Authors: Ran TIAN; Xinmei LI; Zhongyu MA; Yanxing LIU; Jingxia WANG; Chu WANG

Affiliation: College of Computer Science & Engineering, Northwest Normal University, Lanzhou 730070, China

Publication: Frontiers of Information Technology & Electronic Engineering

Year/Volume/Issue: 2023, Vol. 24, No. 9

Pages: 1287-1301

Subject classification: 12 [Management]; 1201 [Management Science and Engineering (degrees awardable in Management or Engineering)]; 081104 [Engineering: Pattern Recognition and Intelligent Systems]; 08 [Engineering]; 0835 [Engineering: Software Engineering]; 0811 [Engineering: Control Science and Engineering]; 0812 [Engineering: Computer Science and Technology (degrees awardable in Engineering or Science)]

Funding: Project supported by the National Natural Science Foundation of China (No. 71961028), the Key Research and Development Program of Gansu Province, China (No. 22YF7GA171), the University Industry Support Program of Gansu Province, China (No. 2023QB-115), the Innovation Fund for Science and Technology-based Small and Medium Enterprises of Gansu Province, China (No. 23CXGA0136), and the Scientific Research Project of the Lanzhou Science and Technology Program, China (No. 2018-01-58)

Keywords: Long-term power forecasting; Long short-term memory (LSTM); UniDrop; Self-attention mechanism

Abstract: Accurate long-term power forecasting is important in the decision-making operation of the power grid and power consumption management of customers to ensure the power system's reliable power supply and the grid economy's reliable operation. However, most time-series forecasting models do not perform well in dealing with long-time-series prediction tasks with a large amount of data. To address this challenge, we propose a parallel time-series prediction model called LDformer. First, we combine Informer with long short-term memory (LSTM) to obtain deep representation abilities in the time series. Then, we propose a parallel encoder module to improve the robustness of the model and combine convolutional layers with an attention mechanism to avoid value redundancy in the attention mechanism. Finally, we propose a probabilistic sparse (ProbSparse) self-attention mechanism combined with UniDrop to reduce the computational overhead and mitigate the risk of losing some key connections in the sequence. Experimental results on five datasets show that LDformer outperforms the state-of-the-art methods in most cases when handling different long-time-series prediction tasks.
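
The efficiency ingredient named in the abstract, ProbSparse self-attention, originates in Informer (Zhou et al., AAAI 2021): each query's scores against a small random sample of keys yield a sparsity measurement, and only the top-scoring queries receive full attention while the rest fall back to a mean of the values. The sketch below is a minimal illustration of that mechanism, not the authors' LDformer code; the tensor shapes, the sampling factor, and the plain dropout standing in for UniDrop are assumptions made for the example.

```python
# Minimal sketch of ProbSparse self-attention as described in Informer.
# NOT the LDformer implementation: factor=5 and F.dropout (in place of
# UniDrop) are illustrative assumptions.
import math
import torch
import torch.nn.functional as F


def probsparse_attention(q, k, v, factor=5, dropout_p=0.1):
    """q, k, v: (batch, heads, seq_len, d). Returns (batch, heads, seq_len, d)."""
    b, h, n, d = q.shape
    # Sparsity measurement: max minus mean of each query's scores against a
    # random subset of keys; only the top-u queries get full attention.
    u = min(n, int(factor * math.ceil(math.log(n))))
    sample_idx = torch.randint(n, (u,))
    scores_sample = q @ k[:, :, sample_idx, :].transpose(-2, -1) / math.sqrt(d)
    sparsity = scores_sample.max(-1).values - scores_sample.mean(-1)
    top_queries = sparsity.topk(u, dim=-1).indices              # (b, h, u)

    # Lazy queries fall back to the mean of the values (Informer's default in
    # the non-causal setting); active queries attend over all keys.
    out = v.mean(dim=-2, keepdim=True).expand(b, h, n, d).clone()
    idx = top_queries.unsqueeze(-1).expand(b, h, u, d)
    q_active = q.gather(-2, idx)                                 # (b, h, u, d)
    scores = q_active @ k.transpose(-2, -1) / math.sqrt(d)       # (b, h, u, n)
    attn = F.dropout(torch.softmax(scores, dim=-1), p=dropout_p, training=True)
    out.scatter_(-2, idx, attn @ v)
    return out


if __name__ == "__main__":
    x = torch.randn(2, 8, 96, 64)  # toy batch: 96-step series, 8 heads
    print(probsparse_attention(x, x, x).shape)  # torch.Size([2, 8, 96, 64])
```

Because only u = O(log n) queries attend over the full key set, score computation drops from O(n^2) to O(n log n), which is the saving the abstract refers to; per the abstract, LDformer combines this mechanism with UniDrop inside its parallel encoder.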
