Academic Journals Database
Disseminating quality controlled scientific knowledge

Two-Versions of Conjugate Gradient-Algorithms Based on Conjugacy Conditions for Unconstrained Optimization

Author(s): Abbas Y. AL-Bayati | A. J. Salim | Khalel K. Abbo

Journal: American Journal of Economics and Business Administration
ISSN 1945-5488

Volume: 1 | Issue: 2 | Start page: 97 | Date: 2009

Keywords: (CG) algorithms | exact line searches | global convergence properties

ABSTRACT
Problem statement: Conjugate gradient (CG) algorithms, the subject of this study, are widely used in optimization, especially for large-scale problems, because they require no matrix storage. The purpose of this study was to construct new CG algorithms suitable for solving large-scale optimization problems. Approach: Based on the pure conjugacy condition and a quadratic convex function, two new versions of CG algorithms were derived; they were observed to generate descent directions at each iteration, and the global convergence of these algorithms under the Wolfe line search conditions was proved. Results: Numerical results for some standard test functions were reported and compared with the classical Fletcher-Reeves and Hestenes-Stiefel algorithms, showing considerable improvement over these standard CG algorithms. Conclusion: Two new versions of CG algorithms were proposed in this study together with their numerical properties and convergence analysis, and they outperformed the standard HS and FR CG algorithms.
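
The paper's two new conjugacy-condition-based updates are not reproduced here; as a minimal sketch of the framework the abstract describes, the classical Fletcher-Reeves (FR) and Hestenes-Stiefel (HS) nonlinear CG updates with a Wolfe line search can be written as follows. The `cg_minimize` routine, the Rosenbrock test function, and all tolerances and iteration limits are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import line_search

def cg_minimize(f, grad, x0, beta_rule="FR", max_iter=500, tol=1e-6):
    """Nonlinear conjugate gradient with a Wolfe line search.

    beta_rule: "FR" (Fletcher-Reeves) or "HS" (Hestenes-Stiefel); the paper's
    two new conjugacy-condition-based updates would replace this choice.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions.
        alpha, *_ = line_search(f, grad, x, d)
        if alpha is None:         # line search failed; restart along -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        if beta_rule == "FR":     # Fletcher-Reeves update
            beta = (g_new @ g_new) / (g @ g)
        else:                     # Hestenes-Stiefel update
            beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d     # new conjugate direction
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Illustrative run on the Rosenbrock function, a standard test problem
    # (assumed here; the paper's own test set is not reproduced).
    f = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    grad = lambda x: np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
    print(cg_minimize(f, grad, np.array([-1.2, 1.0]), beta_rule="HS"))
```

Because only gradients and a few vectors are stored, memory use stays linear in the problem dimension, which is the property the abstract cites as making CG methods attractive for large-scale problems.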