A modification of the conjugate gradient method for unconstrained optimization problems

 
 
 
  • Abstract


    Conjugate Gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems, and several recent studies have been devoted to improving their efficiency and robustness. In this paper, a new CG parameter is proposed. The resulting method possesses global convergence properties under the Strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust than the Polak-Ribière-Polyak (PRP), Fletcher-Reeves (FR), and Wei-Yao-Liu (WYL) parameters.
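    The classical CG parameters the paper compares against can be sketched as follows. This is a minimal illustration of the standard FR, PRP, and WYL formulas for the coefficient β in the direction update d_k = -g_k + β d_{k-1}; the paper's new parameter is not reproduced here, and the helper names are illustrative only.

    ```python
    import math

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def beta_fr(g_new, g_old):
        # Fletcher-Reeves: ||g_k||^2 / ||g_{k-1}||^2
        return dot(g_new, g_new) / dot(g_old, g_old)

    def beta_prp(g_new, g_old):
        # Polak-Ribiere-Polyak: g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2
        y = [a - b for a, b in zip(g_new, g_old)]
        return dot(g_new, y) / dot(g_old, g_old)

    def beta_wyl(g_new, g_old):
        # Wei-Yao-Liu: (||g_k||^2 - (||g_k|| / ||g_{k-1}||) g_k^T g_{k-1}) / ||g_{k-1}||^2
        ratio = math.sqrt(dot(g_new, g_new) / dot(g_old, g_old))
        return (dot(g_new, g_new) - ratio * dot(g_new, g_old)) / dot(g_old, g_old)

    def cg_direction(g_new, d_old, beta):
        # Search direction update: d_k = -g_k + beta * d_{k-1}
        return [-g + beta * d for g, d in zip(g_new, d_old)]
    ```

    In a full method, the step length along d_k would be chosen by a line search enforcing the strong Wolfe-Powell conditions, and the global convergence analysis hinges on which β formula is used with that search.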

     

     

  • Keywords


    Conjugate Gradient Parameter; Inexact Line Search; Strong Wolfe-Powell Line Search; Global Convergence; Unconstrained Optimization.

  • References


      [1] Al-Baali, M., “Descent property and global convergence of the Fletcher-Reeves method with inexact line search”, IMA Journal of Numerical Analysis, Vol.5, No.1, (1985), pp.121-124.

      [2] Alhawarat, A., Salleh, Z., Mamat, M., & Rivaie, M., “An efficient modified Polak–Ribière–Polyak conjugate gradient method with global convergence properties”, Optimization Methods and Software, Vol.32, No.6, (2017), pp.1299-1312.

      [3] Alhawarat, A., Salleh, Z., “Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search”, Abstract and Applied Analysis, Hindawi Publishing Corporation, Vol.2017, (2017).

      [4] Alhawarat, A., Mamat, M., Rivaie, M., & Salleh, Z., “An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search”, Mathematical Problems in Engineering, Vol.2015, (2015).

      [5] Alhawarat, A., Mamat, M., Rivaie, M., & Ismail, M., “A new modification of nonlinear conjugate gradient coefficients with global convergence properties”, International Journal of Mathematical, Computational, Statistical, Natural and Physical Engineering, Vol.8, No.1, (2014), pp.54-60.

      [6] Andrei, N., “An unconstrained optimization test functions collection”, Adv. Model. Optim, Vol.10, No.1, (2008), pp. 147-161.

      [7] Dai, Y. H., Yuan, Y., “A nonlinear conjugate gradient method with a strong global convergence property”, SIAM Journal on Optimization, Vol.10, No.1, (1999), pp.177-182.

      [8] Dolan, E. D., Moré, J. J., “Benchmarking optimization software with performance profiles”, Mathematical Programming, Vol.91, No.2, (2002), pp.201-213.

      [9] Fletcher, R., Reeves, C. M., “Function minimization by conjugate gradients”, The Computer Journal, Vol.7, No.2, (1964), pp.149-154.

      [10] Fletcher, R., Practical methods of optimization, John Wiley & Sons, (2013).

      [11] Gilbert, J. C., Nocedal, J., “Global convergence properties of conjugate gradient methods for optimization”, SIAM Journal on Optimization, Vol.2, No.1, (1992), pp.21-42.

      [12] Gould, N. I. M., Orban, D., & Toint, P. L., “CUTEr and SifDec: A constrained and unconstrained testing environment, revisited”, ACM Transactions on Mathematical Software (TOMS), Vol.29, No.4, (2003), pp.373-394.

      [13] Hestenes, M. R., Stiefel, E., Methods of conjugate gradients for solving linear systems, NBS, (1952).

      [14] Liu, Y., Storey, C., “Efficient generalized conjugate gradient algorithms, part 1: theory”, Journal of Optimization Theory and Applications, Vol.69, No.1, (1991), pp.129-137.

      [15] Polak, E., Ribière, G., “Note sur la convergence de méthodes de directions conjuguées”, Revue française d'informatique et de recherche opérationnelle, Série rouge, Vol.3, No.16, (1969), pp.35-43.

      [16] Polyak, B. T., “The conjugate gradient method in extremal problems”, USSR Computational Mathematics and Mathematical Physics, Vol.9, No.4, (1969), pp.94-112.

      [17] Powell, M. J. D., “Restart procedures for the conjugate gradient method”, Mathematical Programming, Vol.12, No.1, (1977), pp.241-254.

      [18] Salleh, Z., Alhawarat, A., “An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property”, Journal of Inequalities and Applications, Vol.2016, No.1, (2016), Article 110.

      [19] Wolfe, P., “Convergence conditions for ascent methods”, SIAM review, Vol.11, No.2, (1969), pp.226-235.

      [20] Wei, Z., Li, G., & Qi, L., “New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems”, Applied Mathematics and Computation, Vol.179, No.2, (2006), pp.407-430.

      [21] Zhang, Y., Hao, Z., & Chuanlin, Z., “Global convergence of a modified PRP conjugate gradient method”, Procedia Engineering, Vol.31, (2012), pp.986-995.


 

Article ID: 11146
 
DOI: 10.14419/ijet.v7i2.14.11146




Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.