Hybrid Conjugate Gradient Algorithms for Solving Unconstrained Optimization Problems
Published: 2018-06-09 14:20
Topics: unconstrained optimization; hybrid conjugate gradient method. Source: Master's thesis, Southwest University, 2017.
[Abstract]: The conjugate gradient method is a widely used and highly effective class of iterative algorithms for solving large-scale unconstrained optimization problems. Compared with Newton and quasi-Newton methods, its notable advantages are algorithmic simplicity and low storage requirements. It is well known that, among the classical conjugate gradient methods, different choices of method lead to different global convergence behavior and numerical performance. Naturally, many researchers have tried to construct new algorithms that combine good global convergence with excellent numerical performance. One idea is to improve the classical conjugate parameter β_k directly; another is to effectively hybridize conjugate parameters β_k that have good convergence properties with those that have good numerical performance. This thesis mainly follows the latter idea to construct new algorithms. Recently, several hybrid conjugate gradient methods have been proposed and have produced good results. Inspired by this work, the thesis proposes two new classes of hybrid conjugate gradient methods, analyzes their properties and global convergence, and reports extensive numerical results. The main contributions are as follows:

1. Inspired by Dai and Wen (Applied Mathematics and Computation, 2012, 218(14): 7421-7430), Jian et al. (Applied Mathematical Modelling, 2015, 39(3): 1281-1290), and Wei et al. (Applied Mathematics and Computation, 2006, 183(2): 1341-1350), the thesis proposes the NHC method and gives the formula for the new conjugate parameter β_k^NHC. The parameter β_k^NHC possesses a desirable property; regardless of the line search strategy adopted, the NHC method generates a sufficient descent direction at every iteration. Moreover, under the standard Wolfe line search, the proposed algorithm is globally convergent. Finally, extensive numerical experiments show that the proposed algorithm has good computational performance.

2. Inspired by Dai and Wen (Applied Mathematics and Computation, 2012, 218(14): 7421-7430) and Wei et al. (Applied Mathematics and Computation, 2006, 179(2): 407-430), we propose a new class of hybrid conjugate gradient methods for unconstrained optimization, called the HZW method. Its conjugate parameter satisfies 0 ≤ β_k^HZW ≤ β_k^FR. Furthermore, the HZW method generates a sufficient descent direction at every iteration and is globally convergent under the standard Wolfe line search. Numerical experiments also show that the algorithm is effective and feasible.
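For reference, a nonlinear conjugate gradient method iterates x_{k+1} = x_k + α_k d_k with d_0 = -g_0 and d_{k+1} = -g_{k+1} + β_{k+1} d_k, where g_k is the gradient and the step size α_k is chosen by a Wolfe line search; hybrid methods differ only in how the conjugate parameter β_k is assembled from existing formulas. The thesis's NHC and HZW parameters are not reproduced on this page, so the sketch below uses the well-known Dai-Yuan/Hestenes-Stiefel hybrid β_k = max(0, min(β_k^HS, β_k^DY)) purely as a stand-in to illustrate the general framework; the function names and tolerances are illustrative assumptions, not the thesis's algorithm.

```python
# Minimal sketch of a hybrid nonlinear conjugate gradient method.
# The DY-HS hybrid beta is used only as an illustration of how two
# conjugate parameters can be mixed; the thesis's NHC/HZW formulas
# would be substituted at the marked line.
import numpy as np
from scipy.optimize import line_search  # (strong) Wolfe line search


def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:        # convergence test on the gradient
            break
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                   # line search failed; take a small step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference y_k
        denom = d @ y
        if abs(denom) > 1e-12:
            beta_hs = (g_new @ y) / denom            # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / denom        # Dai-Yuan
            beta = max(0.0, min(beta_hs, beta_dy))   # hybrid parameter (stand-in)
        else:
            beta = 0.0                      # restart with steepest descent
        d = -g_new + beta * d               # conjugate direction update
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Example: minimize the Rosenbrock function from scipy's test problems.
    from scipy.optimize import rosen, rosen_der
    print(hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0])))  # near [1, 1]
```

Replacing the beta computation with the β_k^NHC or β_k^HZW formula would turn this skeleton into the corresponding method; the rest of the framework (direction update, Wolfe line search, stopping test) is shared by all conjugate gradient variants discussed in the abstract.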
Article ID: 2000027
Link: https://www.wllwen.com/kejilunwen/yysx/2000027.html