Similar Documents
19 similar documents found
1.
Based on the ideas of using a modified HS method to improve algorithmic efficiency and using the DY method to guarantee global convergence, two new hybrid conjugate gradient methods for large-scale unconstrained optimization are proposed under different sets of conditions. Global convergence of both algorithms is proved under the general Wolfe line search without imposing a descent condition. Numerical experiments demonstrate the effectiveness of the proposed algorithms, which perform particularly well on certain large-scale unconstrained optimization problems.
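For reference, the classical HS and DY parameters that such hybrids combine are the standard formulas (with g_k = \nabla f(x_k), y_{k-1} = g_k - g_{k-1}, and d_{k-1} the previous search direction)

\beta_k^{HS} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}}, \qquad \beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^T y_{k-1}};

the abstract does not give the paper's modified HS formula itself.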

2.
This paper proposes a modified PRP conjugate gradient method for unconstrained optimization. The algorithm computes its parameter by a new formula, which avoids generating overly small step sizes. Under suitable conditions, the algorithm is shown to possess the descent property, and it is globally convergent when the strong Wolfe line search is used. Preliminary numerical results are reported.
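The classical PRP parameter that such modifications start from is

\beta_k^{PRP} = \frac{g_k^T y_{k-1}}{\|g_{k-1}\|^2}, \qquad y_{k-1} = g_k - g_{k-1};

the paper's new formula is not stated in the abstract.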

3.
Building on the LS method, a new conjugate gradient method for unconstrained optimization is proposed. The new method computes its parameter by a new formula, overcoming the numerical instability and weak convergence of the LS method; sufficient descent and global convergence are proved under the strong Wolfe line search. Extensive numerical experiments show that the new method is stable and effective.
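For reference, the classical Liu–Storey (LS) parameter is

\beta_k^{LS} = -\frac{g_k^T y_{k-1}}{d_{k-1}^T g_{k-1}};

the new parameter formula is not given in the abstract.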

4.
Based on a corrected DFP method, this paper proposes a new three-term gradient descent algorithm. The algorithm guarantees sufficient descent at every iteration and is globally convergent for general functions under the strong Wolfe line search. Numerical experiments show that it is highly effective and stable on the given test problems.

5.
Based on conjugacy and descent properties, a three-term conjugate gradient method with forced descent is proposed; its global convergence under the Wolfe line search is proved, and comparative numerical experiments are carried out. The theoretical and numerical results indicate that the algorithm is worthy of further study.
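A common template for such three-term directions (the abstract does not specify the exact coefficients) is

d_k = -g_k + \beta_k d_{k-1} + \theta_k y_{k-1},

where the scalars \beta_k and \theta_k are chosen so that g_k^T d_k < 0 is forced at every iteration.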

6.
This paper proposes a new nonstandard conjugate gradient algorithm for unconstrained optimization, whose search direction resembles that of curve-search algorithms. Global convergence of the new algorithm is proved, and numerical simulations verify that it is effective and fast.

7.
By combining the MFR and MDY methods and adjusting the search direction, this paper proposes a class of modified DY conjugate gradient methods for unconstrained optimization; the method generates a sufficient descent direction at every iteration, independently of any line search. Under suitable conditions, the algorithm is proved to be globally convergent for nonconvex optimization problems under the Armijo line search. Numerical experiments show that the algorithm is effective.
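For reference, the Armijo line search used here accepts any step size \alpha_k satisfying the standard condition

f(x_k + \alpha_k d_k) \le f(x_k) + \rho\, \alpha_k\, g_k^T d_k, \qquad \rho \in (0, 1).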

8.
Building on the DY conjugate gradient method, an improved conjugate gradient method for unconstrained optimization is proposed. The method possesses sufficient descent under the standard Wolfe line search and is globally convergent. Numerical results confirm its effectiveness. Finally, the algorithm is applied to nonlinear parameter estimation for a kinetic model of SO2 oxidation, with satisfactory results.

9.
A New Gradient-Search-Based Algorithm for Global Optimization
Motivated by neural networks, this paper proposes a new global optimization algorithm built around an "inertial search". After proving the algorithm's stability, optimality, and feasibility, concrete steps and a circuit implementation model are given. Simulation results show that the algorithm overcomes the tendency of gradient descent to become trapped at local minima.

10.
A Class of Modified DY Conjugate Gradient Methods and Their Global Convergence
This paper proposes a class of modified DY conjugate gradient methods for unconstrained optimization. The algorithm uses a new iterative scheme that generates a sufficient descent direction at every iteration on its own. Global convergence is proved under the Wolfe line search, and numerical results verify that the algorithm is effective.

11.
International Journal of Computer Mathematics, 2012, 89(16): 3436-3447
The sufficient descent condition is crucial in establishing the global convergence of nonlinear conjugate gradient methods. In this paper, we modify two conjugate gradient methods so that both satisfy this property. Under suitable conditions, we prove the global convergence of the proposed methods. Numerical results show that the proposed methods are efficient on the given test problems.
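The sufficient descent condition referred to here is, in its standard form,

g_k^T d_k \le -c\, \|g_k\|^2 \qquad \text{for some } c > 0 \text{ and all } k.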

12.
13.
We propose a new optimization problem that combines the good features of the classical conjugate gradient method by means of a penalty parameter, and then solve it to obtain a new scaled conjugate gradient method for unconstrained problems. The method reduces to the classical conjugate gradient algorithm under common assumptions and inherits its good properties. We prove global convergence of the method under suitable conditions. Numerical results show that the new method is efficient and robust.

14.
Following Andrei's approach of convexly combining conjugate gradient parameters, a hybridization of the Hestenes–Stiefel (HS) and Dai–Yuan (DY) conjugate gradient (CG) methods is proposed. The hybridization parameter is computed by solving the least-squares problem of minimizing the distance between the search direction of the hybrid method and that of a three-term conjugate gradient method proposed by Zhang et al., which possesses the sufficient descent property. Powell's non-negativity restriction on the HS CG parameter is also employed in the hybrid method. A brief global convergence analysis is given without a convexity assumption on the objective function. Comparative test results are reported; they demonstrate the efficiency of the proposed hybrid CG method in the sense of the Dolan–Moré performance profile.
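In Andrei's convex-combination scheme that this hybrid follows, the parameter has the general form (a standard template; the paper's least-squares rule for choosing \theta_k is as described above)

\beta_k = (1 - \theta_k)\, \beta_k^{HS} + \theta_k\, \beta_k^{DY}, \qquad \theta_k \in [0, 1],

with Powell's restriction replacing \beta_k^{HS} by \max\{\beta_k^{HS}, 0\}.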

15.
The conjugate gradient (CG) method is one of the most popular methods for solving large-scale unconstrained optimization problems. In this paper, a new modified version of the CG formula introduced by Polak, Ribière, and Polyak is proposed for problems that are bounded below and have a Lipschitz-continuous gradient. The new parameter provides global convergence when either the strong Wolfe–Powell (SWP) or the weak Wolfe–Powell (WWP) line search is employed, and a proof of the sufficient descent condition is given for the SWP line search. Numerical comparisons between the proposed parameter and other recent CG modifications are made on a set of standard unconstrained optimization problems; the results demonstrate the efficiency of the proposed CG parameter relative to the others.
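The two line searches mentioned are the standard ones: the strong Wolfe–Powell (SWP) conditions require

f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^T d_k, \qquad |g(x_k + \alpha_k d_k)^T d_k| \le \sigma\, |g_k^T d_k|,

with 0 < \delta < \sigma < 1, while the weak Wolfe–Powell (WWP) line search replaces the second inequality by g(x_k + \alpha_k d_k)^T d_k \ge \sigma\, g_k^T d_k.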

16.
In this paper, a DL-type conjugate gradient method is presented. The method is a modification of the Dai–Liao conjugate gradient method and can also be viewed as a modified LS conjugate gradient method. For general objective functions, it satisfies the sufficient descent condition under the Wolfe line search and is globally convergent. Numerical comparisons show that the proposed algorithm slightly outperforms the PRP+ and CG-descent algorithms as well as the Barzilai–Borwein gradient algorithm.
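For reference, the Dai–Liao parameter that such methods modify is, in its standard form,

\beta_k^{DL} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}} - t\, \frac{g_k^T s_{k-1}}{d_{k-1}^T y_{k-1}}, \qquad t \ge 0, \quad s_{k-1} = x_k - x_{k-1};

the paper's particular modification is not given in the abstract.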

17.
In this paper, motivated by the numerical efficiency of the Hestenes–Stiefel (HS) method, a new modified HS algorithm is proposed for unconstrained optimization. The new direction is independent of the line search and satisfies the sufficient descent condition. Motivated by the theoretical and numerical features of the three-term conjugate gradient (CG) methods of Narushima et al., and in the spirit of the Dai and Kou approach, the new direction is computed by minimizing the distance between the CG direction and the direction of the three-term CG methods of Narushima et al. Under mild conditions, we establish global convergence of the new method for general functions when the standard Wolfe line search is used. Numerical experiments on test problems from the CUTEst collection demonstrate the efficiency of the proposed method.

18.
In this paper, a modified Polak–Ribière–Polyak (MPRP) conjugate gradient method for smooth unconstrained optimization is proposed. The method produces a descent direction at each iteration, and this property is independent of the line search adopted. Under standard assumptions, we prove that the MPRP method with strong Wolfe conditions is globally convergent. Computational results are reported and show the effectiveness of the proposed method.
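As a generic illustration of how a PRP-type CG iteration with a Wolfe line search is organized (a minimal sketch, not the paper's MPRP method: the PRP+ safeguard, the fallback step, and the use of SciPy's line_search helper are our assumptions):

import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG with the PRP+ parameter and a Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy searches for a step satisfying the Wolfe conditions;
        # it returns None for alpha when the search fails.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4                        # crude fallback step (an assumption)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = max((g_new @ y) / (g @ g), 0.0)  # PRP+: beta = max{beta_PRP, 0}
        d = -g_new + beta * d                   # new search direction
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function; the iterates should approach [1, 1].
print(prp_plus_cg(rosen, rosen_der, np.array([-1.2, 1.0])))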

19.
Spectral conjugate gradient methods, with their simple construction and good numerical performance, are an effective class of methods for solving large-scale unconstrained optimization problems. In this paper, based on the quasi-Newton direction and the quasi-Newton condition, and motivated by the idea of the spectral conjugate gradient method as well as Dai and Kou's technique for selecting the conjugate parameter [SIAM J. Optim. 23 (2013), pp. 296–320], a new approach for generating spectral parameters is presented. A new double-truncating technique is introduced that ensures both the sufficient descent property of the search directions and the boundedness of the sequence of spectral parameters. A new spectral conjugate gradient method for large-scale unconstrained optimization is then proposed. Under either the strong Wolfe line search or the generalized Wolfe line search, the proposed method is always globally convergent. Finally, extensive comparative numerical experiments on large-scale instances with one thousand to two million variables are reported; the results show that the proposed method is more promising.
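The spectral CG search direction referred to here has the general form

d_k = -\theta_k g_k + \beta_k d_{k-1},

where \theta_k > 0 is the spectral parameter; the paper's double-truncating rules for \theta_k and \beta_k are not given in the abstract.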
