A NEW REDUCED GRADIENT METHOD
Author affiliation: Institute of Mathematics, Academia Sinica
Publication: Science in China, Ser. A (English edition)
Year/Issue: 1979, No. 10
Pages: 1099-1113
Subject classification: 07 [Natural Sciences]; 0701 [Natural Sciences - Mathematics]; 070101 [Natural Sciences - Pure Mathematics]
Subject: A NEW REDUCED GRADIENT METHOD
Abstract: In this paper we give a reduced gradient method and its convergence properties. The main results are as follows:
(i) If the objective function f is continuously differentiable and the constraints are non-degenerate, then, starting from any feasible point, the iterative sequence {x^k} generated by the method either terminates after a finite number of iterations, or every cluster point of the sequence is a Kuhn-Tucker (K.-T.) point.
(ii) If {x^k} is a convergent sequence, then the pivoting operations in the course of the algorithm occur only a finite number of times; that is, after a finite number of iterations the pivot selection remains unchanged.
(iii) If the sequence {x^k} converges to x^*, and x^* has the strict complementary slackness property, then x_j^k = 0 for every index j with x_j^* = 0, except for a finite number of k.
(iv) If f is twice continuously differentiable and its Hessian is uniformly positive definite, then {x^k} converges to an optimal solution.
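For context, the following is a minimal sketch of the standard linearly constrained setting in which reduced gradient methods are usually stated; the problem form, the basic/nonbasic partition, and the notation below are generic assumptions for illustration and are not taken from the paper itself.

\[
\begin{aligned}
&\min_{x \in \mathbb{R}^n} \; f(x) \quad \text{subject to} \quad Ax = b,\; x \ge 0,\\
&A = [\,B \;\; N\,],\quad x = (x_B, x_N),\quad B \text{ nonsingular (selected by the pivoting operation)},\\
&x_B = B^{-1}\bigl(b - N x_N\bigr),\qquad
r \;=\; \nabla_{x_N} f(x) \;-\; \bigl(B^{-1} N\bigr)^{\top} \nabla_{x_B} f(x) \quad \text{(the reduced gradient)}.
\end{aligned}
\]

Under this assumed formulation, a feasible point x^* is a K.-T. point when r_j = 0 for every nonbasic index j with x_j^* > 0 and r_j >= 0 for every nonbasic index j with x_j^* = 0; strict complementary slackness at x^* means that x_j^* and the corresponding multiplier r_j are never both zero.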