TWO NOVEL GRADIENT METHODS WITH OPTIMAL STEP SIZES
Author Affiliation: Centro de Investigacion en Matematicas (CIMAT A.C.), Guanajuato, Gto., Mexico
Published in: Journal of Computational Mathematics (计算数学(英文))
Year/Volume/Issue: 2021, Vol. 39, No. 3
Pages: 375-391
Subject Classification: 07 [Science]; 0701 [Science - Mathematics]; 070101 [Science - Pure Mathematics]
Funding: Supported in part by CONACYT (Mexico) Grants 258033 and 256126
Keywords: Gradient methods; Convex quadratic optimization; Hessian spectral properties; Steplength selection
Abstract: In this work we introduce two new Barzilai and Borwein-like step sizes for the classical gradient method for strictly convex quadratic optimization. The proposed step sizes employ second-order information in order to obtain faster gradient-type methods. The step sizes are derived from two unconstrained optimization models that involve approximate information of the Hessian of the objective function. A convergence analysis of the proposed algorithm is presented. Several numerical experiments are performed in order to compare the efficiency and effectiveness of the proposed methods with similar methods in the literature. In particular, it is observed that our proposals accelerate the gradient method at nearly no extra computational cost, which makes them a good alternative for solving large-scale problems.
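The record does not reproduce the paper's two new step sizes, but the family of methods it extends can be illustrated. The following is a minimal sketch of a gradient method with the classical Barzilai-Borwein (BB1) step size applied to a strictly convex quadratic f(x) = (1/2)x^T A x - b^T x; the function name, tolerances, and the initial-step heuristic are illustrative choices, not taken from the paper.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=1000):
    """Gradient method with Barzilai-Borwein (BB1) step sizes for
    minimizing f(x) = 0.5 x^T A x - b^T x with A symmetric positive
    definite. Illustrative sketch only; the two new step sizes proposed
    in the paper are not reproduced here."""
    x = x0.astype(float)
    g = A @ x - b                        # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)      # heuristic first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g            # gradient step
        g_new = A @ x_new - b
        s = x_new - x                    # iterate difference
        y = g_new - g                    # gradient difference
        alpha = (s @ s) / (s @ y)        # BB1 step; s^T y > 0 since A is SPD
        x, g = x_new, g_new
    return x

# Usage: solve a small SPD system, i.e. minimize the quadratic
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = bb_gradient(A, b, np.zeros(2))
```

At the minimizer the gradient vanishes, so the returned iterate satisfies A x ≈ b. Like the methods discussed in the abstract, each iteration costs essentially one matrix-vector product, which is what makes BB-type steps attractive for large-scale problems.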