TY - GEN
T1 - Power load forecasting based on VMD and attention-LSTM
AU - Chao, Han Chieh
AU - Lin, Fu
AU - Pan, Jeng Shyang
AU - Chien, Wei Che
AU - Lai, Chin Feng
N1 - Funding Information:
This work was supported in part by the Ministry of Science and Technology Program (Project No. MOST 107-2221-E-259-005-MY3).
Publisher Copyright:
© 2020 ACM.
PY - 2020/7/24
Y1 - 2020/7/24
N2 - Accurate short-term load forecasting is of great help to demand-side response and power dispatching. To improve the accuracy of short-term power load prediction, the original power load signal is decomposed using the Variational Mode Decomposition (VMD) method. The decomposed sub-signals and the original signal form a new data set on which the neural network is trained. The sub-signals reflect detailed features inside the power load that are difficult for the neural network to learn from the raw signal alone. Through VMD analysis, the neural network can learn richer information, which is more effective than the superposition prediction method. The prediction model adopts an architecture based on Attention long short-term memory (Attention-LSTM); the attention mechanism enables important decomposed information to be fully learned. The effectiveness of this method is demonstrated by experiments.
AB - Accurate short-term load forecasting is of great help to demand-side response and power dispatching. To improve the accuracy of short-term power load prediction, the original power load signal is decomposed using the Variational Mode Decomposition (VMD) method. The decomposed sub-signals and the original signal form a new data set on which the neural network is trained. The sub-signals reflect detailed features inside the power load that are difficult for the neural network to learn from the raw signal alone. Through VMD analysis, the neural network can learn richer information, which is more effective than the superposition prediction method. The prediction model adopts an architecture based on Attention long short-term memory (Attention-LSTM); the attention mechanism enables important decomposed information to be fully learned. The effectiveness of this method is demonstrated by experiments.
UR - http://www.scopus.com/inward/record.url?scp=85090906170&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85090906170&partnerID=8YFLogxK
U2 - 10.1145/3414274.3414277
DO - 10.1145/3414274.3414277
M3 - Conference contribution
AN - SCOPUS:85090906170
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the 3rd International Conference on Data Science and Information Technology, DSIT 2020
PB - Association for Computing Machinery
T2 - 3rd International Conference on Data Science and Information Technology, DSIT 2020
Y2 - 24 July 2020 through 26 July 2020
ER -
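
The record itself contains no code. As a rough, non-authoritative sketch of the kind of pipeline the abstract describes (VMD sub-signals stacked with the original load series and fed to an attention-weighted LSTM), the following PyTorch example is illustrative only; the AttentionLSTM class, the layer sizes, and the assumption that the K VMD modes arrive as extra input channels are assumptions, not details taken from the paper.

import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    """Illustrative Attention-LSTM: LSTM over a window of load values,
    attention pooling over time steps, linear head for the next-step load."""
    def __init__(self, n_channels, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden_size, batch_first=True)
        self.score = nn.Linear(hidden_size, 1)   # one attention score per time step
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, n_channels); channels = original load + K VMD sub-signals
        h, _ = self.lstm(x)                       # (batch, seq_len, hidden)
        w = torch.softmax(self.score(h), dim=1)   # attention weights over the time axis
        context = (w * h).sum(dim=1)              # attention-weighted sum of hidden states
        return self.head(context).squeeze(-1)     # (batch,) predicted next-step load

# Hypothetical usage: 48-step window, original series plus 4 VMD modes (5 channels).
model = AttentionLSTM(n_channels=5)
x = torch.randn(8, 48, 5)                         # dummy batch of decomposed load windows
y_hat = model(x)
print(y_hat.shape)                                # torch.Size([8])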