A NEW DESCENT MEMORY GRADIENT METHOD AND ITS GLOBAL CONVERGENCE

Authors: Min SUN, Department of Mathematics and Information Science, Zaozhuang University, Zaozhuang 277160, China; Qingguo BAI, School of Management, Qufu Normal University, Rizhao 276826, China.

Affiliations: Department of Mathematics and Information Science, Zaozhuang University, Zaozhuang 277160, China; School of Management, Qufu Normal University, Rizhao 276826, China.

Published in: Journal of Systems Science & Complexity (系统科学与复杂性学报(英文版))

Year/Volume/Issue: 2011, Vol. 24, No. 4

Pages: 784-794

Subject classification: 0810 [Engineering - Information and Communication Engineering]; 1205 [Management - Library, Information and Archives Management]; 12 [Management]; 1201 [Management - Management Science and Engineering (degrees in Management or Engineering)]; 07 [Science]; 070105 [Science - Operations Research and Cybernetics]; 0811 [Engineering - Control Science and Engineering]; 0701 [Science - Mathematics]; 0812 [Engineering - Computer Science and Technology (degrees in Engineering or Science)]

Funding: Supported by the National Science Foundation of China under Grant No. 70971076 and the Foundation of Shandong Provincial Education Department under Grant No. J10LA59.

Keywords: global convergence, memory gradient method, sufficient descent

Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large-scale unconstrained optimization problems. The method has the following attractive properties: 1) the search direction is a sufficient descent direction at every iteration, independently of the line search used; 2) the search direction always satisfies the angle property, regardless of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method is globally convergent, and its convergence rate is also investigated. Numerical results show that the new descent memory gradient method is efficient for the given test problems.
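
For illustration only, the sketch below shows the general memory gradient framework the abstract refers to, namely an iteration of the form d_k = -g_k + beta_k * d_{k-1} combined with an Armijo backtracking line search. The beta_k rule and the descent safeguard used here are assumptions chosen for the sketch; they are not the authors' formula, whose restart-free construction and sufficient descent property are established in the full paper.

```python
# Illustrative memory-gradient-type iteration (NOT the authors' exact method):
#   d_0 = -g_0,  d_k = -g_k + beta_k * d_{k-1}
# with a simple safeguard enforcing the sufficient descent condition
#   g_k^T d_k <= -c * ||g_k||^2
# and an Armijo backtracking line search.
import numpy as np

def memory_gradient(f, grad, x0, c=0.5, rho=0.5, sigma=1e-4, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking: accept t with f(x + t d) <= f(x) + sigma * t * g^T d
        t, gtd = 1.0, g @ d
        while f(x + t * d) > f(x) + sigma * t * gtd and t > 1e-12:
            t *= rho
        x = x + t * d
        g_new = grad(x)
        # memory term reusing the previous direction; this beta is a placeholder rule
        beta = (g_new @ g_new) / max(g @ g, 1e-16)
        d_new = -g_new + beta * d
        # safeguard (for this sketch only): fall back to -g if sufficient descent fails
        if g_new @ d_new > -c * (g_new @ g_new):
            d_new = -g_new
        g, d = g_new, d_new
    return x

# Example: minimize the Rosenbrock function
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(memory_gradient(f, grad, np.array([-1.2, 1.0])))
```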
