A Comparative Study of the Spectral Conjugate Gradient Methods in Regression Analysis

  • Authors

    • Norhaslinda Zull
    • Wan Khadijah
    • Ummi Khalthum Mohd Yusof
    • Syazni Shoid
    • Mohd Rivaie
    • Mustafa Mamat
    https://doi.org/10.14419/ijet.v7i3.28.23468
  • Keywords: spectral, convergence analysis, strong Wolfe, regression.
  • In this paper, a spectral conjugate gradient (CG) method is introduced for solving unconstrained optimization problems. The method is compared with other spectral CG coefficients. Eighteen types of optimization test problems, each at several dimensions, are used to test the efficiency and robustness of the spectral CG methods using Matlab subroutine programming. The convergence of the method is established under the strong Wolfe line search. The numerical results, based on iteration number and CPU time, are presented as performance profiles using SigmaPlot. The method is also implemented in regression analysis to validate its capability for estimating data.
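A minimal sketch of the approach the abstract describes, shown in Python rather than Matlab: a spectral CG iteration with a simple strong-Wolfe line search, applied to a least-squares regression fit. The spectral parameter (Barzilai-Borwein) and the Hestenes-Stiefel-type CG coefficient here are common choices assumed for illustration; the paper's exact coefficient is not reproduced.

```python
# Illustrative sketch only: a spectral CG method with a crude strong-Wolfe
# line search, applied to least-squares regression f(w) = 0.5*||Xw - y||^2.
# The spectral parameter (Barzilai-Borwein) and the CG coefficient
# (Hestenes-Stiefel-type) are assumptions, not the paper's exact formulas.
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=50):
    """Bisection-style search for a step satisfying the strong Wolfe conditions."""
    fx, slope = f(x), grad(x) @ d
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        ga = grad(x + alpha * d)
        if f(x + alpha * d) > fx + c1 * alpha * slope:   # Armijo fails: shrink
            hi = alpha
        elif abs(ga @ d) > c2 * abs(slope):              # curvature fails: grow
            lo = alpha
            if np.isinf(hi):
                alpha *= 2
                continue
        else:
            return alpha
        alpha = 0.5 * (lo + hi)
    return alpha

def spectral_cg(f, grad, x0, tol=1e-8, max_iter=500):
    x, g = x0.astype(float), grad(x0)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: restart if not a descent direction
            d = -g
        alpha = strong_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, yk = x_new - x, g_new - g
        theta = (s @ s) / (s @ yk)        # Barzilai-Borwein spectral parameter
        beta = (g_new @ yk) / (d @ yk)    # Hestenes-Stiefel-type coefficient
        d = -theta * g_new + beta * d     # spectral CG search direction
        x, g = x_new, g_new
    return x

# Regression example: recover coefficients of a noiseless linear model.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true
f = lambda w: 0.5 * np.sum((X @ w - y) ** 2)
grad = lambda w: X.T @ (X @ w - y)
w_hat = spectral_cg(f, grad, np.zeros(3))
```

On noiseless data the recovered `w_hat` matches `w_true` to high precision, which mirrors how the paper validates the method's capability to estimate regression data.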


  • References

      [1] A. Abashar, M. Mamat, M. Rivaie, I. Mohd, Global convergence properties of a new class of conjugate gradient method for unconstrained optimization, Applied Math. Sci., 65-68 (2014) 3307-3319.

      [2] A. Abashar, M. Mamat, M. Rivaie, I. Mohd, O. Omer, The proof of sufficient descent condition for a new type of conjugate gradient methods, AIP Conference Proceedings, 1602 (2014) 296-303.

      [3] A.A. Goldstein, On steepest descent, SIAM J. Control., 3 (1965) 147-151.

      [4] D. Touati-Ahmed, C. Storey, Efficient hybrid conjugate gradient techniques, J. Optim. Theory Appl., 64 (1990) 379-397.

      [5] E. Dolan, J.J. Moré, Benchmarking optimization software with performance profiles, Math. Prog., 91 (2002) 201-213.

      [6] E. Polak, G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Française Inform. Recherche Opérationnelle, 3 (1969) 35-43.

      [7] E.G. Birgin, J.M. Martinez, A spectral conjugate gradient method for unconstrained optimization, J. Appl. Maths. Optimization, 43 (2001) 117-128.

      [8] G. Zoutendijk, Nonlinear programming computational methods, In J. Abadie (Ed.), Integer and nonlinear programming, Amsterdam: North Holland, pp. 37-86, 1970.

      [9] J. Barzilai, J.M. Borwein, Two-point step size gradient methods, IMA Journal of Numerical Analysis, 8 (1988) 141-148.

      [10] J. Liu, Y. Jiang, Global convergence of a spectral conjugate gradient method for unconstrained optimization, Abstract and Applied Analysis, 2012, 1-12.

      [11] K.E. Hilstrom, A simulation test approach to the evaluation of nonlinear optimization algorithms, ACM. Trans. Math. Softw., 3 (1977) 305–315.

      [12] L. Armijo, Minimization of functions having Lipschitz continuous partial derivatives, Pacific J. Mathematics, 16 (1966) 1-3.

      [13] M. Raydan, The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem, SIAM Journal on Optimization, 7 (1997) 26-33.

      [14] M. Rivaie, M. Mustafa, L.W. June, I. Mohd, A new class of nonlinear conjugate gradient coefficient with global convergence properties, Appl. Math. Comp., 218 (2012) 11323–11332.

      [15] M.R. Hestenes, E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Stand., 49 (1952) 409-436.

      [16] N. ‘Aini, N. Hajar, M. Mamat, N. Zull, M. Rivaie, Hybrid quasi-Newton and conjugate gradient method for solving unconstrained optimization problems, Journal of Engineering and Applied Sciences, 12(18) (2017) 4627-4631.

      [17] N. Andrei, An unconstrained optimization test functions collection, Adv. Modell. Optim., 10 (2008) 147–161.

      [18] N. Hajar, N. ‘Aini, N. Shapiee, Z.Z. Abidin, W. Khadijah, M. Rivaie, M. Mamat, A new modified conjugate gradient coefficient for solving system of linear equations, Journal of Physics: Conference Series, 890 (2017) 1-6.

      [19] N. Zull, M. Rivaie, M. Mamat, Z. Salleh, Z. Amani, Global convergence properties of a new spectral conjugate gradient by using strong Wolfe line search, Applied Math. Sci., 9 (2015) 3105-3117.

      [20] N. Zull, N. ‘Aini, S. Shoid, N.H.A. Ghani, N.S. Mohamed, M. Rivaie, M. Mamat, A conjugate gradient method with descent properties under strong Wolfe line search, Journal of Physics: Conference Series, 890 (2017) 1-6.

      [21] N.H.A. Ghani, N.S. Mohamed, N. Zull, S. Shoid, M. Rivaie, M. Mamat, Performance comparison of a new hybrid conjugate gradient method under exact and inexact line search, Journal of Physics: Conference Series, 890 (2017) 1-6.

      [22] N.Z. Abidin, M. Mamat, B. Dangerfield, J.H. Zulkepli, M.A. Baten, A. Wibowo, Combating obesity through healthy eating behavior: A call for system dynamics optimization, Plos One, 9 (2014) 1-17.

      [23] P. Wolfe, Convergence conditions for ascent methods, SIAM Rev., 11 (1969) 226-235.

      [24] R. Fletcher, C. Reeves, Function minimization by conjugate gradients, Comput. J., 7 (1964) 149-154.

      [25] R. Fletcher, Practical method of optimization, John Wiley and Sons, 2013.

      [26] W. Khadijah, M. Rivaie, M. Mamat, I. Jusoh, A spectral KRMI conjugate gradient method under the strong-Wolfe line search, AIP Conference Proceedings, 1739 (2016) 1-8.

      [27] Y.H. Dai, Y. Yuan, An efficient hybrid conjugate gradient method for unconstrained optimization, Ann. Oper. Res., 103 (2002) 33-47.


  • How to Cite

    Zull, N., Khadijah, W., Mohd Yusof, U. K., Shoid, S., Rivaie, M., & Mamat, M. (2018). A Comparative Study of the Spectral Conjugate Gradient Methods in Regression Analysis. International Journal of Engineering & Technology, 7(3.28), 316-320. https://doi.org/10.14419/ijet.v7i3.28.23468