An Improved Non-negative Latent Factor Model via Momentum-Based Additive Gradient Descent Method
Affiliations: Department of Control Science and Engineering, University of Shanghai for Science and Technology; Informatization and Industrialization Integration Research Institute, China Academy of Information and Communications Technology; Shanghai Marine Equipment Research Institute
Conference: Proceedings of the 40th Chinese Control Conference
Conference date: 2021
Subject classification: 08 [Engineering]; 081202 [Engineering – Computer Software and Theory]; 0812 [Engineering – Computer Science and Technology (degrees conferrable in Engineering or Science)]
Keywords: Nonnegative latent factor (NLF) model; Symmetric, high-dimensional and sparse (SHiDS) matrices; Single NLF-dependent and multiplicative update (SNL-MU) learning method; Momentum-based additive gradient descent (MAGD) method
Abstract: The nonnegative latent factor (NLF) model has a great ability to acquire useful knowledge from symmetric, high-dimensional and sparse (SHiDS) matrices. However, the traditional NLF model based on the double factorization (DF) technique is somewhat conservative due to its inadequate consideration of different cases of matrices. To address this issue, we propose an improved DF-SNLF (IDF-SNLF) model to extract NLF matrices in various cases. Then, the single NLF-dependent and multiplicative update (SNL-MU) learning method is employed to build an NLF model on SHiDS matrices, yet it suffers from a fairly low convergence rate. Hence, for the purpose of accelerating the convergence rate, a so-called momentum-based additive gradient descent (MAGD) method is adopted to train the model. Empirical studies on two SHiDS matrices demonstrate that our proposed IDF-SNLF model with MAGD can obtain a desirable performance in both prediction accuracy for missing data and convergence rate.
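The momentum-based additive update described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the squared-error loss, the symmetric factorization X ≈ UUᵀ, the learning rate, the momentum coefficient, and the projection to nonnegative values are all illustrative assumptions.

```python
import numpy as np

def magd_nlf(entries, n, rank=2, lr=0.01, momentum=0.9, epochs=800, seed=0):
    """Momentum-based additive gradient descent for a nonnegative latent
    factor model X ~ U @ U.T on a symmetric sparse matrix (sketch).

    entries: list of (i, j, value) observed cells with i <= j.
    Hyperparameters and the squared-error loss are illustrative choices.
    """
    rng = np.random.default_rng(seed)
    U = rng.uniform(0.1, 0.5, size=(n, rank))   # nonnegative initialization
    V = np.zeros_like(U)                        # velocity (momentum buffer)
    for _ in range(epochs):
        G = np.zeros_like(U)
        for i, j, x in entries:
            err = U[i] @ U[j] - x               # prediction error on one cell
            G[i] += err * U[j]                  # gradient of 0.5 * err**2 w.r.t. U[i]
            G[j] += err * U[i]                  # ... and w.r.t. U[j]
        V = momentum * V - lr * G               # accumulate momentum
        U = np.maximum(U + V, 0.0)              # additive step, then project to >= 0
    return U

# Toy symmetric 3x3 matrix with one missing entry at (0, 2).
obs = [(0, 0, 1.0), (0, 1, 0.8), (1, 1, 1.0), (1, 2, 0.6), (2, 2, 1.0)]
U = magd_nlf(obs, n=3)
pred = U @ U.T  # pred[0, 2] estimates the missing entry
```

Unlike the multiplicative update of SNL-MU, which keeps factors nonnegative by construction, this additive step can go negative, so the sketch enforces nonnegativity with a simple projection after each update; the momentum buffer is what accelerates convergence relative to plain gradient descent.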