Recent Patents on Computer Science

ISSN (Print): 2213-2759
ISSN (Online): 1874-4796

Research Article

Analysis of Non-Linear Activation Functions for Classification Tasks Using Convolutional Neural Networks

Author(s): Aman Dureja* and Payal Pahwa

Volume 12, Issue 3, 2019

Pages: 156 - 161 (6 pages)

DOI: 10.2174/2213275911666181025143029

Abstract

Background: Activation functions play an important role in building deep neural networks, and the choice of activation function affects both the optimization of the network and the quality of its results. Several activation functions have been introduced in machine learning for many practical applications, but which activation function should be used at the hidden layers of deep neural networks has not been clearly identified.

Objective: The primary objective of this analysis was to determine which activation function should be used at the hidden layers of deep neural networks to solve complex non-linear problems.

Methods: The comparative model was configured on a two-class (Cat/Dog) image dataset. The network used three convolutional layers, with a pooling layer introduced after each convolutional layer. The dataset was divided into two parts: the first 8000 images were used for training the network and the remaining 2000 images were used for testing it. A sketch of this configuration is given below.
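As a rough illustration of the configuration described above, the following Python (Keras) sketch builds a three-convolutional-layer network with a pooling layer after each convolution and a binary Cat/Dog output. The filter counts, kernel sizes, input resolution, dense-layer width and optimizer are assumptions for illustration; only the layer count, the pooling scheme, the two-class task and the 8000/2000 split come from the description above.

from tensorflow.keras import layers, models

def build_cnn(activation="relu"):
    # Three convolutional layers, each followed by a pooling layer, as described above.
    # Filter counts, kernel sizes and the 64x64 input resolution are assumed values.
    model = models.Sequential([
        layers.Input(shape=(64, 64, 3)),
        layers.Conv2D(32, (3, 3), activation=activation),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(32, (3, 3), activation=activation),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation=activation),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation=activation),
        layers.Dense(1, activation="sigmoid"),   # binary Cat/Dog output
    ])
    model.compile(optimizer="adam",              # optimizer is an assumption
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Training would then use the first 8000 images and validate on the remaining 2000, e.g.
#   model.fit(x_train, y_train, epochs=25, validation_data=(x_test, y_test))
# where x_train/x_test are hypothetical arrays holding the Cat/Dog images.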

Results: The experimental comparison was carried out by analysing the network with different activation functions (ReLU, Tanh, SELU, PReLU, ELU) at the hidden layers and measuring the validation error and accuracy on the Cat/Dog dataset. Overall, ReLU gave the best performance, with a validation loss of 0.3912 and a validation accuracy of 0.8320 at the 25th epoch.
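For reference, the activation functions compared above can be written as element-wise functions; the following is a minimal NumPy sketch, with the SELU constants taken from the standard self-normalizing formulation and the PReLU slope fixed at an illustrative 0.25 (in the actual network it is a learned parameter).

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def tanh(x):
    return np.tanh(x)

def elu(x, alpha=1.0):
    # alpha * (exp(x) - 1) for negative inputs, identity otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def prelu(x, a=0.25):
    # 'a' is learned during training in the actual network; 0.25 is illustrative
    return np.where(x > 0, x, a * x)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scaled ELU with the standard self-normalizing constants
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))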

Conclusion: A CNN model with ReLU at the hidden layers (three hidden layers here) gives the best results and improves overall performance in terms of both accuracy and speed. These advantages of ReLU at the hidden layers of a CNN help to retrieve images from databases effectively and quickly.

Keywords: CNN, activation function, hidden layers, deep neural networks, non-linear problems, machine learning.


Rights & Permissions Print Cite
© 2024 Bentham Science Publishers | Privacy Policy