KMS Chongqing Institute of Green and Intelligent Technology, CAS
Adjusted stochastic gradient descent for latent factor analysis |
Li, Qing (1,2); Xiong, Diwen (1); Shang, Mingsheng (1) |
2022-04-01 |
Abstract | A high-dimensional and incomplete (HDI) matrix is a common form of big data in many industrial applications. A latent factor analysis (LFA) model optimized by the stochastic gradient descent (SGD) algorithm is often adopted to learn the abundant knowledge in an HDI matrix. Despite its computational tractability and scalability, the regular SGD algorithm tends to get stuck in a local optimum when solving a bilinear problem such as LFA. To address this issue, this paper proposes an Adjusted Stochastic Gradient Descent (ASGD) algorithm for latent factor analysis, whose adjustment mechanism considers the bi-polar gradient directions during optimization. This mechanism is theoretically proven to be effective in overstepping local saddle points and avoiding premature convergence. In addition, the model's hyper-parameters are made self-adaptive via the particle swarm optimization (PSO) algorithm for higher practicality. Experimental results show that the proposed model outperforms other state-of-the-art approaches on six HDI matrices from industrial applications, especially in prediction accuracy for missing data. (c) 2021 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). |
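The SGD-optimized LFA scheme the abstract refers to can be sketched as follows. This is a minimal illustration of plain SGD factorizing a sparse (HDI) matrix, with hypothetical learning-rate and regularization values; it does not reproduce the paper's ASGD gradient adjustment or its PSO-based hyper-parameter tuning:

```python
import random


def sgd_lfa(entries, n_rows, n_cols, rank=4, lr=0.02, reg=0.02, epochs=500):
    """Factorize a sparse matrix given as (row, col, value) triples.

    Returns row-factor matrix P (n_rows x rank) and column-factor
    matrix Q (n_cols x rank) such that P[i] . Q[j] approximates the
    observed entry (i, j). Plain SGD with L2 regularization.
    """
    random.seed(0)
    P = [[random.uniform(-0.1, 0.1) for _ in range(rank)] for _ in range(n_rows)]
    Q = [[random.uniform(-0.1, 0.1) for _ in range(rank)] for _ in range(n_cols)]
    for _ in range(epochs):
        for i, j, r in entries:
            # Prediction error on this observed entry only (missing
            # entries of the HDI matrix are never touched).
            pred = sum(P[i][k] * Q[j][k] for k in range(rank))
            e = r - pred
            for k in range(rank):
                p, q = P[i][k], Q[j][k]
                # Gradient step on the regularized squared error.
                P[i][k] += lr * (e * q - reg * p)
                Q[j][k] += lr * (e * p - reg * q)
    return P, Q
```

Because only observed entries drive the updates, the learned factors can then be multiplied out to predict the missing entries, which is the prediction-accuracy task the abstract evaluates.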
Keywords | Big data analysis; High-dimensional and incomplete matrix; Stochastic gradient descent; Latent factor analysis; Gradient adjustment; Adaptive model; Particle swarm optimization; Local optima |
DOI | 10.1016/j.ins.2021.12.065 |
Journal | INFORMATION SCIENCES |
ISSN | 0020-0255 |
Volume | 588 |
Pages | 196-213 |
Corresponding Author | Shang, Mingsheng (msshang@cigit.ac.cn) |
Indexed By | SCI |
WOS ID | WOS:000768300300011 |
Language | English |