Long short-term memory with activation on gradient
Qin, Chuan; Chen, Liangming; Cai, Zangtai; Liu, Mei; Jin, Long
2023-07-01
Abstract: As the number of long short-term memory (LSTM) layers increases, vanishing/exploding gradient problems worsen and degrade the performance of the LSTM. In addition, the ill-conditioned problem arises during LSTM training and adversely affects its convergence. In this work, a simple and effective gradient activation method is applied to the LSTM, and empirical criteria for choosing the gradient activation hyperparameters are established. Activating the gradient refers to modifying the gradient with a specific function, named the gradient activation function. Moreover, different activation functions and different gradient operations are compared to demonstrate that gradient activation is effective for the LSTM. Furthermore, comparative experiments are conducted, and their results show that gradient activation alleviates the above problems and accelerates the convergence of the LSTM. The source code is publicly available at https://github.com/LongJin-lab/ACT-In-NLP. © 2023 Elsevier Ltd. All rights reserved.
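To make the idea in the abstract concrete, below is a minimal sketch of gradient activation in PyTorch: every parameter gradient is passed through an activation function before the optimizer step. The power-law function sign(g)·|g|^k, the hyperparameter k, and the helper name activate_gradient are illustrative assumptions rather than the paper's exact formulation; the authors' implementation is available in the linked repository.

```python
import torch
import torch.nn.functional as F

def activate_gradient(grad, k=0.5, eps=1e-8):
    # Hypothetical power-law gradient activation: maps each gradient
    # element g to sign(g) * |g|^k. For 0 < k < 1 this shrinks large
    # gradients and amplifies small ones, which is the kind of effect
    # the paper targets for exploding/vanishing gradients. The paper's
    # actual activation function and hyperparameters may differ.
    return torch.sign(grad) * grad.abs().clamp(min=eps).pow(k)

model = torch.nn.LSTM(input_size=32, hidden_size=64, num_layers=4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(20, 8, 32)        # (seq_len, batch, input_size)
target = torch.randn(20, 8, 64)   # dummy regression target

output, _ = model(x)
loss = F.mse_loss(output, target)

optimizer.zero_grad()
loss.backward()
# Activate (transform) every gradient before the parameter update.
for p in model.parameters():
    if p.grad is not None:
        p.grad = activate_gradient(p.grad)
optimizer.step()
```

Note that, unlike gradient clipping, this transformation is applied elementwise to all gradients on every step, not only when a norm threshold is exceeded.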
Keywords: Long short-term memory (LSTM); Gradient activation; Vanishing gradient problem; Exploding gradient problem; Ill-conditioned problem
DOI: 10.1016/j.neunet.2023.04.026
Journal: NEURAL NETWORKS
ISSN: 0893-6080
Volume: 164, Pages: 135-145
Corresponding authors: Liu, Mei (liumeisysu@qq.com); Jin, Long (jinlongsysu@foxmail.com)
Indexed by: SCI
WOS ID: WOS:001054169800001
Language: English