
An Accelerated Stochastic Mirror Descent Method

Authors: Bo-Ou Jiang, Ya-Xiang Yuan

Affiliations: LSEC, ICMSEC, AMSS, Chinese Academy of Sciences, Beijing 100190, China; Department of Mathematics, University of Chinese Academy of Sciences, Beijing 100049, China

Journal: Journal of the Operations Research Society of China

Year/Volume/Issue: 2024, Vol. 12, No. 3

Pages: 549-571


Subject Classification: 07 [Science]; 0701 [Science - Mathematics]; 070101 [Science - Pure Mathematics]

Funding: National Natural Science Foundation of China, NSFC (1228201)

Keywords: Large-scale optimization; Variance reduction; Mirror descent; Acceleration; Independent sampling; Importance sampling

Abstract: Driven by large-scale optimization problems arising from machine learning, the development of stochastic optimization methods has witnessed huge growth. Various types of methods have been developed based on vanilla stochastic gradient descent. However, for most algorithms, the convergence rate in the stochastic setting cannot simply match that in the deterministic setting. Better understanding the gap between deterministic and stochastic optimization is the main goal of this paper. Specifically, we are interested in Nesterov acceleration of gradient-based methods. In our study, we focus on acceleration of the stochastic mirror descent method with an implicit regularization property. Assuming that the problem objective is smooth and convex or strongly convex, our analysis prescribes the method parameters which ensure fast convergence of the estimation error and satisfactory numerical performance.
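To make the mirror descent framework referenced in the abstract concrete, the sketch below shows a generic stochastic mirror descent step with the entropic mirror map (multiplicative update on the probability simplex). This is a standard textbook form, not the accelerated method of the paper; the function name, parameters, and uniform single-component sampling are illustrative assumptions.

```python
import numpy as np

def stochastic_mirror_descent(grad_fn, x0, n_samples, steps, eta):
    """Entropic stochastic mirror descent on the probability simplex.

    A generic sketch (not the paper's accelerated variant):
    grad_fn(x, i) returns the gradient of the i-th component function,
    and one component is sampled uniformly per step.
    """
    rng = np.random.default_rng(0)
    x = x0.copy()
    for _ in range(steps):
        i = rng.integers(n_samples)   # uniform sampling of one component
        g = grad_fn(x, i)             # stochastic gradient estimate
        x = x * np.exp(-eta * g)      # exponentiated (entropic mirror map) update
        x /= x.sum()                  # re-normalize onto the simplex
    return x
```

For a linear objective with a constant gradient, the iterates concentrate on the coordinate with the smallest gradient entry, which is the expected behavior of entropic mirror descent over the simplex.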
