Modified Conjugate Gradient Algorithms Based on the Conjugate Gradient Parameters of Wei Zengxin et al.
Published: 2018-06-19 02:54
Topics: inexact line search; conjugate gradient method. Source: Master's thesis, Chongqing Normal University, 2015.
[Abstract]: Based on the conjugate gradient parameters of Wei Zengxin et al., this thesis proposes several modified conjugate gradient algorithms, establishes convergence theorems for them, and verifies the effectiveness of the proposed algorithms through extensive numerical experiments. Chapter 1 reviews the basics of nonlinear conjugate gradient methods, some existing global convergence results for conjugate gradient methods, the main contributions of this thesis, and some important lemmas and assumptions. Chapter 2 proposes four modified nonlinear conjugate gradient methods based on the conjugate gradient parameters of Wei Zengxin et al., called the NVLS, NVPRP*, NVHS*, and NVLS* methods. The sufficient descent property and global convergence of each of the NVLS, NVPRP*, NVHS*, and NVLS* methods are proved under the strong Wolfe line search condition. Numerical results show that the NVPRP* method outperforms the NVPRP method, the NVHS* method outperforms the NVHS method, and the NVLS* method outperforms the NVLS method. Chapter 3 proposes a two-parameter family of conjugate gradient methods, called the THCG* method, which can be viewed as a convex combination of the NVPRP*, NVHS*, and NVLS* methods proposed in Chapter 2. The sufficient descent property and global convergence of the THCG* method are proved under the strong Wolfe line search condition. Numerical results show that although the THCG* method is slightly worse than the NVHS* method, it outperforms the PRP, NVPRP*, NVLS*, and THCG methods. Chapter 4 modifies the NVPRP*, NVHS*, and NVLS* methods of Chapter 2, yielding the MDPRP*, MDHS*, and MDLS* methods. When μ ≥ 0, their sufficient descent property and global convergence are proved under the strong Wolfe line search condition; when μ 2 [sic], the sufficient descent property and global convergence of the MDPRP*, MDHS*, and MDLS* methods are proved under the Wolfe line search condition. Numerical results show that the MDPRP* method outperforms the NVPRP* method, the MDHS* method outperforms the NVHS* method, and the MDLS* method outperforms the NVLS* method. Chapter 5 proposes a modified Dai-Liao conjugate gradient method, called the MDL* method, whose sufficient descent property and global convergence are proved under the strong Wolfe line search condition. Numerical results show that the MDL* method outperforms the PRP, DL, MDL, NVHS*, and DY methods.
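The abstract names the modified conjugate gradient parameters (NV*/MD*) but does not reproduce their formulas, so the following is only a minimal sketch of the generic nonlinear conjugate gradient iteration they plug into. The classical PRP+ parameter β = max(0, gₖ₊₁ᵀ(gₖ₊₁ − gₖ)/‖gₖ‖²) stands in for the thesis's parameters, and a simple Armijo backtracking search replaces the strong Wolfe search used in the convergence proofs; all function names are illustrative, not from the thesis.

```python
# Sketch of a nonlinear conjugate gradient method. The conjugate
# parameter here is classical PRP+ (NOT the thesis's NV*/MD* formulas,
# which this page does not give); the line search is plain Armijo
# backtracking rather than the strong Wolfe search of the proofs.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cg_minimize(f, grad, x, iters=200, tol=1e-8):
    g = grad(x)
    d = [-gi for gi in g]               # initial steepest-descent direction
    for _ in range(iters):
        if dot(g, g) <= tol ** 2:       # stop when ||grad|| is tiny
            break
        slope = dot(g, d)
        if slope >= 0:                  # safeguard: restart if not a descent direction
            d = [-gi for gi in g]
            slope = dot(g, d)
        # Armijo backtracking: halve alpha until sufficient decrease holds
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # PRP+ conjugate parameter (truncated at zero, which restarts the method)
        beta = max(0.0, dot(g_new, [a - b for a, b in zip(g_new, g)]) / dot(g, g))
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Usage: minimize the strictly convex quadratic f(x, y) = (x-1)^2 + 5(y+2)^2
f = lambda p: (p[0] - 1) ** 2 + 5 * (p[1] + 2) ** 2
grad = lambda p: [2 * (p[0] - 1), 10 * (p[1] + 2)]
sol = cg_minimize(f, grad, [0.0, 0.0])
```

Swapping in one of the thesis's parameters would only change the `beta` line; the sufficient descent and global convergence results quoted above are what justify such a swap under the (strong) Wolfe line search.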
Article ID: 2038065
Link: https://www.wllwen.com/kejilunwen/yysx/2038065.html