TWO NOVEL GRADIENT METHODS WITH OPTIMAL STEP SIZES

Authors: Harry Oviedo; Oscar Dalmau; Rafael Herrera

Affiliation: Centro de Investigacion en Matematicas, CIMAT A.C., Guanajuato, Gto., Mexico

Publication: Journal of Computational Mathematics

Year/Volume/Issue: 2021, Vol. 39, No. 3

Pages: 375-391

Core indexing:

Subject classification: 07 [Science]; 0714 [Science - Statistics (degree may be conferred in science or economics)]; 0701 [Science - Mathematics]; 0812 [Engineering - Computer Science and Technology (degree may be conferred in engineering or science)]; 070101 [Science - Fundamental Mathematics]

Funding: Supported in part by CONACYT (Mexico) Grants 258033 and 256126

Keywords: Gradient methods; Convex quadratic optimization; Hessian spectral properties; Steplength selection

Abstract: In this work we introduce two new Barzilai and Borwein-like step sizes for the classical gradient method for strictly convex quadratic optimization problems. The proposed step sizes employ second-order information in order to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information of the Hessian of the objective function. A convergence analysis of the proposed algorithm is provided. Some numerical experiments are performed in order to compare the efficiency and effectiveness of the proposed methods with similar methods in the literature. Experimentally, it is observed that our proposals accelerate the gradient method at nearly no extra computational cost, which makes our proposal a good alternative to solve large-scale problems.
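
Illustration: the abstract refers to Barzilai-Borwein-like step sizes within the classical gradient method for strictly convex quadratics. The Python sketch below shows the general framework with the standard BB1 step size only; it does not reproduce the paper's two novel step sizes, and the function name, parameters, and test problem are illustrative assumptions.

import numpy as np

def bb_gradient_method(A, b, x0, tol=1e-8, max_iter=1000):
    """Gradient method with the classical BB1 step size for the strictly
    convex quadratic f(x) = 0.5 * x^T A x - b^T x, with A symmetric
    positive definite. Sketch only; not the paper's proposed step sizes."""
    x = x0.copy()
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # heuristic choice for the first step
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g          # gradient step
        g_new = A @ x_new - b
        s = x_new - x                  # difference of iterates
        y = g_new - g                  # difference of gradients (= A s here)
        alpha = (s @ s) / (s @ y)      # BB1 step size; s^T y > 0 since A is SPD
        x, g = x_new, g_new
    return x

# Example usage on a small random symmetric positive definite system
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x_star = bb_gradient_method(A, b, np.zeros(50))
print(np.linalg.norm(A @ x_star - b))  # residual norm should be near zero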
