Modified Conjugate Gradient Algorithms Based on the Wei-Yao-Liu Conjugate Gradient Parameter
Posted: 2018-03-16 05:15
Topic: unconstrained optimization  Focus: conjugate gradient method  Source: Chongqing Normal University, 2015 master's thesis  Document type: degree thesis
【Abstract】: Optimization is a widely applied and rapidly developing discipline, and unconstrained optimization is the foundation of optimization problems. The most basic unconstrained optimization methods include the steepest descent method, Newton's method, the conjugate gradient method, and quasi-Newton methods. This thesis focuses on conjugate gradient methods for solving large-scale unconstrained optimization problems; because they require little storage and are simple to implement, they are among the most commonly used methods in optimization, with wide applications in aerospace, oil exploration, atmospheric simulation, and engineering design. Building on existing results at home and abroad, the thesis carries out an extensive study of conjugate gradient algorithms. The main results are summarized as follows. Chapter 1 briefly introduces several common optimization methods for unconstrained problems and reviews background material on the conjugate gradient method. Chapter 2 proposes, on the basis of the VPRP conjugate gradient method, several modified conjugate gradient methods with a perturbation factor, and proves that these new methods possess the sufficient descent property and global convergence under general line searches such as the generalized Wolfe line search; numerical experiments show that the new methods are effective. Chapter 3 considers the respective advantages and disadvantages of the PRP and FR, HS and DY, and LS and CD conjugate gradient methods; combining a nonnegativity restriction on the conjugate gradient parameter with the idea of hybrid conjugate gradient methods, it proposes three new hybrid parameter formulas and proves that the corresponding conjugate gradient methods possess the sufficient descent property and global convergence under the strong Wolfe line search; numerical experiments show that these new methods are effective. Chapter 4 exploits the roles of the factors appearing in the various conjugate gradient parameters to propose a new hybrid parameter formula, which yields a conjugate gradient method whose sufficient descent property holds independently of any line search; the new method is proved to converge globally under the standard Wolfe line search, and numerical results show that it is effective.
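For context, the following is a brief LaTeX sketch, written as these objects commonly appear in the conjugate gradient literature, of the general iteration, the Wei-Yao-Liu (WYL) parameter named in the title alongside the classical FR and PRP parameters, and the strong Wolfe and sufficient descent conditions mentioned in the abstract; the thesis's own modified and hybrid parameters are not reproduced here.

% Standard nonlinear conjugate gradient iteration for minimizing f(x),
% with g_k = \nabla f(x_k) and step size \alpha_k from a line search
% (requires amsmath for the cases environment):
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]
% where different choices of the scalar \beta_k give different methods, e.g.
\[
\beta_k^{\mathrm{FR}} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^2}, \qquad
\beta_k^{\mathrm{WYL}} =
  \frac{g_k^{\top}\bigl(g_k - \tfrac{\|g_k\|}{\|g_{k-1}\|}\, g_{k-1}\bigr)}{\|g_{k-1}\|^2}.
\]
% Strong Wolfe line search conditions, with constants 0 < \delta < \sigma < 1:
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \qquad
\bigl|\nabla f(x_k + \alpha_k d_k)^{\top} d_k\bigr| \le \sigma\, \bigl|g_k^{\top} d_k\bigr|.
\]
% Sufficient descent property (for some constant c > 0 and all k):
\[
g_k^{\top} d_k \le -c\, \|g_k\|^2 .
\]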
【Degree-granting institution】: Chongqing Normal University
【Degree level】: Master's
【Year degree conferred】: 2015
【Classification number】: O224