A Review on the Concept of Deep Learning

Authors

  • Shalini Bhaskar Bajaj, Professor, Department of Computer Science and Engineering, Amity University Haryana, Gurugram, India

Keywords:

Artificial neural networks, deep learning, feedforward network

Abstract

Artificial Neural Networks (ANNs) have a number of application areas, ranging from economic analysis to image processing and recognition. Many online stores use ANNs in recommendation systems to offer suitable products based on each customer's preferences. Beyond this, artificial neural networks are used today in routing and navigation systems, as in unmanned vehicles, antivirus software, and more. In this work, a recurrent and convolutional neural network operating at the level of letters is proposed in order to classify and sort textual information into given classes.
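The character-level approach described above can be illustrated with a minimal sketch: each character of the input text is one-hot encoded, a 1D convolution slides over the character sequence, and max-over-time pooling feeds a linear classifier. This is an illustrative forward pass with random weights, not the authors' implementation; the alphabet, filter sizes, and function names are assumptions chosen for the example.

```python
import numpy as np

def one_hot(text, alphabet="abcdefghijklmnopqrstuvwxyz "):
    # Map each character to a one-hot row vector; unknown characters stay all-zero.
    idx = {c: i for i, c in enumerate(alphabet)}
    x = np.zeros((len(text), len(alphabet)))
    for t, ch in enumerate(text.lower()):
        if ch in idx:
            x[t, idx[ch]] = 1.0
    return x  # shape: (sequence_length, alphabet_size)

def char_cnn_forward(x, n_filters=8, width=3, n_classes=2, seed=0):
    # Tiny character-level CNN forward pass: conv -> ReLU -> max-pool -> linear -> softmax.
    rng = np.random.default_rng(seed)
    T, V = x.shape
    W_conv = rng.standard_normal((n_filters, width, V)) * 0.1
    # Slide each filter over the character sequence (valid convolution).
    conv = np.stack([
        np.array([np.sum(x[t:t + width] * W_conv[f]) for t in range(T - width + 1)])
        for f in range(n_filters)
    ])                                           # (n_filters, T - width + 1)
    pooled = np.maximum(conv, 0.0).max(axis=1)   # max-over-time pooling, (n_filters,)
    W_out = rng.standard_normal((n_classes, n_filters)) * 0.1
    logits = W_out @ pooled
    e = np.exp(logits - logits.max())
    return e / e.sum()                           # class probabilities

probs = char_cnn_forward(one_hot("deep learning"))
```

In a trained model the convolution and output weights would be learned by backpropagation; here they are random, so only the data flow and tensor shapes of the character-level architecture are shown.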


References

Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun, Deep Residual Learning for Image Recognition, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, 2016.

Andrej Krenker, Janez Bešter and Andrej Kos, Introduction to the Artificial Neural Networks, Artificial Neural Networks - Methodological Advances and Biomedical Applications

Zhongqiang Zhang, Stochastic Processes, Springer, 2016, MA 529

N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, R. Salakhutdinov, Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Journal of Machine Learning Research, vol. 15, (2014), pp. 1929-1958

Tomas Mikolov, Kai Chen, Greg Corrado, Jeffrey Dean, Efficient Estimation of Word Representations in Vector Space, Computation and Language, Cornell University, arXiv:1301.3781

Yoon Kim, Convolutional Neural Networks for Sentence Classification, Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar

Rafal Jozefowicz, Oriol Vinyals, Mike Schuster, Noam Shazeer, Yonghui Wu, Exploring the Limits of Language Modeling, Computation and Language, Cornell University, arXiv:1602.02410

Klaus Greff, Rupesh Kumar Srivastava, Jan Koutník, Bas R. Steunebrink, Jürgen Schmidhuber, LSTM: A Search Space Odyssey, Neural and Evolutionary Computing, Cornell University, arXiv:1503.04069

Siwei Lai, Liheng Xu, Kang Liu, Jun Zhao, Recurrent Convolutional Neural Networks for Text Classification, Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence, 2015, pp. 2267-2273

Jiaming Xu, Peng Wang, Guanhua Tian, Bo Xu, Jun Zhao, Fangyuan Wang, Hongwei Hao, Short Text Clustering via Convolutional Neural Networks, Proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing, June 2015, pp. 62-69

Yin Zhang, Rong Jin, Zhi-Hua Zhou, Understanding Bag-of-Words Model: A Statistical Framework, International Journal of Machine Learning and Cybernetics, vol. 1, pp. 43-52, 2010

Yann LeCun, Leon Bottou, Yoshua Bengio and Patrick Haffner, Gradient-Based Learning Applied to Document Recognition, Proceedings of the IEEE, November 1998

X. Chen, X. Liu, M. J. F. Gales, P. C. Woodland, Recurrent Neural Network Language Model Training with Noise Contrastive Estimation for Speech Recognition, 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brisbane, QLD, Australia

Matthew D. Zeiler, ADADELTA: An Adaptive Learning Rate Method, Machine Learning (2012), Cornell University, arXiv:1212.5701

Dumitru Erhan, Pierre-Antoine Manzagol, Yoshua Bengio, Samy Bengio and Pascal Vincent, The Difficulty of Training Deep Architectures and the Effect of Unsupervised Pre-Training, Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS) 2009, Clearwater Beach, Florida, USA. Volume 5 of JMLR: W&CP 5

Pierre Baldi, Kurt Hornik, Neural Networks and Principal Component Analysis: Learning from Examples Without Local Minima, Neural Networks, Vol. 2, pp. 53-58, 1989

Sergey Ioffe, Christian Szegedy, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 2015. JMLR: W&CP volume 37

Yadav, Neha, Yadav, Anupam, Kumar, Manoj, An Introduction to Neural Network Methods for Differential Equations, SpringerBriefs in Computational Intelligence, Springer, 2015

I. V. Zaentsev, Neural Networks: Basic Models

https://cs231n.github.io/

http://www.aiportal.ru/articles/neural-networks/decision-xor.html

https://malaikannan.wordpress.com/2016/09/13/cross-entropy/

https://en.wikipedia.org/wiki/Overfitting

https://habr.com/en/post/175819/

http://www.nanonewsnet.ru/articles/2016/kak-obuchaetsya-ii

https://geektimes.ru/post/74326/

http://deeplearning.net/tutorial/lenet.html

http://www.360doc.com/content/16/0303/19/2459_539162206.shtml

http://colah.github.io/posts/2014-07-Understanding-Convolutions/

https://automaticaddison.com/artificial-feedforward-neural-network-with-backpropagation-from-scratch/

https://www.researchgate.net/figure/15x15-Pixel-Labelled-Sample-Images-After-Image-Creation-Phase_fig4_324802031


Published

2020-05-05

How to Cite

A Review on the Concept of Deep Learning. (2020). International Journal of Innovative Research in Computer Science & Technology, 8(3), 126-130. Retrieved from https://acspublisher.com/journals/index.php/ijircst/article/view/13280