Convergence Analysis of Single Latent Factor-Dependent, Nonnegative, and Multiplicative Update-Based Nonnegative Latent Factor Models
Liu, Zhigang; Luo, Xin; Wang, Zidong
2021-04-01
Abstract: A single latent factor (LF)-dependent, nonnegative, and multiplicative update (SLF-NMU) learning algorithm is highly efficient in building a nonnegative LF (NLF) model defined on a high-dimensional and sparse (HiDS) matrix. However, the convergence characteristics of such NLF models have never been justified in theory. To address this issue, this study conducts a rigorous convergence analysis for an SLF-NMU-based NLF model. The main idea is twofold: 1) proving that its learning objective is nonincreasing under its SLF-NMU-based learning rules, via constructing specific auxiliary functions; and 2) proving that it converges to a stable equilibrium point under its SLF-NMU-based learning rules, via analyzing the Karush-Kuhn-Tucker (KKT) conditions of its learning objective. Experimental results on ten HiDS matrices from real applications provide numerical evidence for the correctness of the achieved proof.
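The SLF-NMU rule analyzed in the paper updates each latent factor with a nonnegative multiplicative step computed only over the known entries of the HiDS matrix. Below is a minimal sketch of this style of update, assuming a Euclidean-loss NLF model R ≈ PQ^T trained on the known entry set; the function name, signature, and training loop are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def slf_nmu_sketch(entries, m, n, f=20, epochs=100, eps=1e-12, seed=0):
    """Illustrative SLF-NMU-style training of a nonnegative latent factor
    (NLF) model on the known entries of an m-by-n HiDS matrix.

    entries: iterable of (u, i, r) triples -- the known entry set.
    Returns nonnegative factor matrices P (m-by-f) and Q (n-by-f).
    """
    rng = np.random.default_rng(seed)
    P = rng.random((m, f)) + eps   # nonnegative initialization
    Q = rng.random((n, f)) + eps
    rows = np.array([e[0] for e in entries])
    cols = np.array([e[1] for e in entries])
    vals = np.array([e[2] for e in entries], dtype=float)
    for _ in range(epochs):
        # Multiplicative update for P: each factor is scaled by the ratio of
        # "observed" to "predicted" sums over that row's known entries, which
        # keeps P nonnegative without any additional projection step.
        preds = np.sum(P[rows] * Q[cols], axis=1)       # predicted r_hat(u, i)
        num = np.zeros_like(P)
        den = np.zeros_like(P)
        np.add.at(num, rows, vals[:, None] * Q[cols])   # sum of r * q
        np.add.at(den, rows, preds[:, None] * Q[cols])  # sum of r_hat * q
        P *= num / (den + eps)
        # Symmetric update for Q, using refreshed predictions.
        preds = np.sum(P[rows] * Q[cols], axis=1)
        num = np.zeros_like(Q)
        den = np.zeros_like(Q)
        np.add.at(num, cols, vals[:, None] * P[rows])
        np.add.at(den, cols, preds[:, None] * P[rows])
        Q *= num / (den + eps)
    return P, Q
```

Per the paper's result, the learning objective on the known entries is nonincreasing under such updates and the factors converge to a stable (KKT) equilibrium point; a quick numerical check is to log the squared error over `entries` each epoch and verify it never increases.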
Keywords: Convergence; Computational modeling; Learning systems; Analytical models; Sparse matrices; Big Data; big data; convergence; high-dimensional and sparse (HiDS) matrix; latent factor (LF) analysis; learning system; neural networks; nonnegative LF (NLF) analysis; single LF-dependent, nonnegative, and multiplicative update (SLF-NMU)
DOI: 10.1109/TNNLS.2020.2990990
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Volume: 32, Issue: 4, Pages: 1737-1749
Corresponding Author: Luo, Xin (luoxin21@cigit.ac.cn)
Indexed By: SCI
WOS Record: WOS:000637534200027
Language: English