A Neural Basis for the Implementation of Deep Learning and Artificial Intelligence

 
 
 
  • Abstract


    One of the mathematical cornerstones of modern data analytics is machine learning, whereby we automatically learn subtle patterns that may be hidden in training data, associate those patterns with outcomes, and apply them to new and unseen data to make predictions about as yet unseen outcomes. This form of data analytics allows us to bring value to the huge volumes of data collected from people, from the environment, from commerce, from online activities, from scientific experiments, and from many other sources. The mathematical basis for this form of machine learning has led to tools such as Support Vector Machines, which have shown moderate effectiveness and good efficiency in their implementation. Recently, however, these have been superseded by the emergence of deep learning based on convolutional neural networks. In this presentation we will examine why such deep networks are remarkably successful and accurate, their similarity to the ways in which the human brain is organised, and the challenges of implementing such deep networks on conventional computer architectures.
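    To make the convolution operation at the heart of such networks concrete, the following minimal NumPy sketch shows how a single filter slides over an image to produce a feature map. The function names and the toy edge-detecting kernel are illustrative choices for this sketch, not taken from the presentation; real deep networks stack many such filters with learned weights.

    ```python
    import numpy as np

    def conv2d(image, kernel):
        """Valid-mode 2D cross-correlation: the core operation of a
        convolutional layer (no padding, stride 1)."""
        ih, iw = image.shape
        kh, kw = kernel.shape
        oh, ow = ih - kh + 1, iw - kw + 1
        out = np.empty((oh, ow))
        for i in range(oh):
            for j in range(ow):
                # Weighted sum of the kernel against one image window.
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    def relu(x):
        """Rectified-linear nonlinearity applied after the convolution."""
        return np.maximum(x, 0.0)

    # Toy example: a vertical-edge kernel applied to a tiny image whose
    # right half is bright; the feature map lights up along the edge.
    image = np.array([[0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [0, 0, 1, 1]], dtype=float)
    kernel = np.array([[-1, 1],
                       [-1, 1]], dtype=float)
    feature_map = relu(conv2d(image, kernel))  # 3x3 map, strongest at the edge
    ```

    A trained network would learn many such kernels from data rather than hand-specifying them; the per-window weighted sum is the same either way.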


  • Keywords


    Deep learning, neural computing, neural networks


Article ID: 23913
 
DOI: 10.14419/ijet.v7i4.36.23913




Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.