An Improved Non-negative Latent Factor Model via Momentum-Based Additive Gradient Descent Method

Authors: Ming Li, Yan Song, Chongjing Wang, Yanfang Chu

Affiliations: Department of Control Science and Engineering, University of Shanghai for Science and Technology; Informatization and Industrialization Integration Research Institute, China Academy of Information and Communications Technology; Shanghai Marine Equipment Research Institute

Conference: The 40th Chinese Control Conference

Conference date: 2021

Subject classification: 08 [Engineering]; 081202 [Engineering - Computer Software and Theory]; 0812 [Engineering - Computer Science and Technology (degrees conferrable in Engineering or Science)]

Keywords: Nonnegative latent factor (NLF) model; Symmetric, high-dimensional and sparse (SHiDS) matrices; Single NLF-dependent and multiplicative update (SNL-MU) learning method; Momentum-based additive gradient descent (MAGD) method

Abstract: The nonnegative latent factor (NLF) model has a great ability to acquire useful knowledge from symmetric, high-dimensional and sparse (SHiDS) matrices. However, the traditional NLF model based on the double factorization (DF) technique is somewhat conservative due to its inadequate consideration of the different cases of factorization. In order to address this issue, we propose an improved DF-SNLF (IDF-SNLF) model to extract NLF matrices in various cases. Moreover, the single NLF-dependent and multiplicative update (SNL-MU) learning method is employed to build an NLF model on SHiDS matrices, yet it suffers from a fairly low convergence rate. Hence, for the purpose of accelerating the convergence rate, a so-called momentum-based additive gradient descent (MAGD) method is adopted to train the model. Empirical studies on two SHiDS matrices demonstrate that our proposed IDF-SNLF model with MAGD can obtain desirable performance in both prediction accuracy for missing data and convergence rate.
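The abstract combines two ingredients: a nonnegative latent factorization of a symmetric, high-dimensional and sparse (SHiDS) matrix, and a momentum-based additive gradient descent (MAGD) update used to accelerate training. The Python sketch below only illustrates that general idea under simplifying assumptions; it is not the paper's IDF-SNLF or SNL-MU algorithm. The symmetric target X ≈ U Uᵀ, the squared loss over observed entries, the learning rate, the momentum coefficient, and the clipping-based nonnegativity projection are all assumptions made for the illustration.

# Illustrative sketch only (not the paper's exact algorithm): factorize a
# symmetric sparse matrix X ≈ U U^T with a nonnegative latent factor matrix U,
# fitting only the observed entries, using a momentum-based additive gradient
# step followed by projection back onto the nonnegative orthant.
import numpy as np

def train_nlf_momentum(entries, n, rank=8, lr=0.01, beta=0.9,
                       epochs=500, seed=0):
    """entries: list of (i, j, value) observed cells of a symmetric matrix."""
    rng = np.random.default_rng(seed)
    U = rng.uniform(0.0, 0.1, size=(n, rank))   # nonnegative initialization
    V = np.zeros_like(U)                        # momentum (velocity) buffer

    for _ in range(epochs):
        grad = np.zeros_like(U)
        for i, j, x in entries:
            err = U[i] @ U[j] - x               # prediction error on entry (i, j)
            grad[i] += err * U[j]               # gradient of 0.5 * err^2 w.r.t. U[i]
            grad[j] += err * U[i]               # ... and w.r.t. U[j]
        V = beta * V - lr * grad                # momentum accumulates past gradients
        U = np.maximum(U + V, 0.0)              # additive update + nonnegativity projection
    return U

# Toy usage: a few observed entries of a 5x5 symmetric matrix.
obs = [(0, 1, 2.0), (1, 2, 1.5), (0, 3, 0.5), (3, 4, 1.0), (2, 4, 2.5)]
U = train_nlf_momentum(obs, n=5)
print(np.round(U @ U.T, 2))                     # reconstructed dense approximation

Setting beta to 0 in this sketch recovers plain additive gradient descent, which makes it easy to see how the momentum term changes the convergence behavior that the abstract is concerned with.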
