A NEW DESCENT MEMORY GRADIENT METHOD AND ITS GLOBAL CONVERGENCE
Author Affiliations: Department of Mathematics and Information Science, Zaozhuang University, Zaozhuang 277160, China; School of Management, Qufu Normal University, Rizhao 276826, China
Publication: Journal of Systems Science & Complexity
Year/Volume/Issue: 2011, Vol. 24, No. 4
Pages: 784-794
Subject Classification: 12 [Management]; 1201 [Management - Management Science and Engineering (degrees in Management or Engineering)]; 07 [Science]; 070105 [Science - Operations Research and Cybernetics]; 0701 [Science - Mathematics]
Funding: Supported by the National Science Foundation of China under Grant No. 70971076 and the Foundation of the Shandong Provincial Education Department under Grant No. J10LA59.
Keywords: global convergence; memory gradient method; sufficient descent
Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large-scale unconstrained optimization problems. The method has two attractive properties: 1) the search direction is a sufficient descent direction at every iteration, independent of the line search used; 2) the search direction always satisfies the angle property, independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method is globally convergent, and they also investigate its convergence rate. Numerical results show that the new descent memory gradient method is efficient on the given test problems.
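The record gives only a high-level description of the method; the paper's specific direction-update formula is not reproduced here. As a minimal sketch of the general memory gradient framework the abstract refers to (not the authors' method), the Python snippet below forms d_k = -g_k + beta_k * d_{k-1} and caps beta_k so that the sufficient descent condition g_k^T d_k <= -c ||g_k||^2 mentioned in the abstract holds; the Armijo backtracking line search and the particular cap on beta_k are assumptions made for illustration.

    import numpy as np

    def memory_gradient(f, grad, x0, c=0.5, tol=1e-6, max_iter=1000):
        # Generic memory gradient iteration d_k = -g_k + beta_k * d_{k-1}.
        # Hypothetical sketch; beta_k below is NOT the paper's formula.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g  # first iteration: steepest descent direction
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Armijo backtracking line search (a standard choice, assumed here).
            alpha, rho, sigma = 1.0, 0.5, 1e-4
            while f(x + alpha * d) > f(x) + sigma * alpha * g.dot(d):
                alpha *= rho
            x = x + alpha * d
            g = grad(x)
            # Cap the memory weight so the new direction satisfies the
            # sufficient descent condition g^T d <= -c * ||g||^2:
            # with beta <= (1 - c) * ||g||^2 / |g^T d_old|, we get
            # g^T d_new = -||g||^2 + beta * g^T d_old <= -c * ||g||^2.
            beta = (1.0 - c) * g.dot(g) / (abs(g.dot(d)) + 1e-12)
            d = -g + beta * d
        return x

    # Usage: minimize the quadratic f(x) = 0.5 * ||x||^2.
    f = lambda x: 0.5 * x.dot(x)
    grad = lambda x: x
    print(memory_gradient(f, grad, np.array([3.0, -4.0])))

Because the cap keeps every direction a sufficient descent direction regardless of how the iterates behave, no restart to the steepest descent direction is needed, which mirrors the "without restarts" feature the abstract highlights.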