Universiti Teknologi Malaysia Institutional Repository

Bounded activation functions for enhanced training stability of deep neural networks on visual pattern recognition problems

Liew, S. S. and Khalil-Hani, M. and Bakhteri, R. (2016) Bounded activation functions for enhanced training stability of deep neural networks on visual pattern recognition problems. Neurocomputing, 216, pp. 718-734. ISSN 0925-2312

Full text not available from this repository.

Official URL: https://www.scopus.com/inward/record.uri?eid=2-s2....

Abstract

This paper focuses on enhancing the generalization ability and training stability of deep neural networks (DNNs). New activation functions, which we call the bounded rectified linear unit (ReLU), bounded leaky ReLU, and bounded bi-firing functions, are proposed. These activation functions are defined based on the desired properties of the universal approximation theorem (UAT). In addition, a new set of coefficient values for the scaled hyperbolic tangent function is presented. These contributions result in improved classification performance and training stability in DNNs. Experiments using multilayer perceptron (MLP) and convolutional neural network (CNN) models show that the proposed activation functions outperform their respective original forms in terms of classification accuracy and numerical stability. Tests on the MNIST and mnist-rot-bg-img handwritten digit databases and the AR Purdue face database show that significant improvements of 17.31%, 9.19%, and 74.99% in testing misclassification error rate (MCR) can be achieved with both the mean squared error (MSE) and cross-entropy (CE) loss functions, without sacrificing computational efficiency. With the MNIST dataset, bounding the output of an activation function reduces numerical instability by 78.58%, and with the mnist-rot-bg-img and AR Purdue databases the problem is eliminated completely. This work thus demonstrates the significance of bounding an activation function in alleviating the training instability problem when training a DNN model (particularly a CNN).
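For illustration only, the minimal NumPy sketch below shows one common "clipped" formulation of such bounded activations, f(x) = min(max(0, x), A); the bound value (upper), the leak coefficient (alpha), and the function names are assumptions made for this sketch, and the paper's exact definitions of the bounded ReLU, bounded leaky ReLU, and bounded bi-firing functions may differ.

import numpy as np

def bounded_relu(x, upper=1.0):
    # ReLU whose output is clipped to the interval [0, upper] (assumed bound).
    return np.minimum(np.maximum(0.0, x), upper)

def bounded_leaky_relu(x, alpha=0.01, upper=1.0):
    # Leaky ReLU (slope alpha for x < 0) with its positive side clipped at upper.
    return np.minimum(np.where(x > 0.0, x, alpha * x), upper)

x = np.linspace(-3.0, 3.0, 7)
print(bounded_relu(x))        # positive outputs saturate at the assumed bound of 1.0
print(bounded_leaky_relu(x))  # small negative slope below zero, clipped positive side

The intuition behind bounding, as described in the abstract, is that capping the activation output keeps forward and backward signals within a fixed range, which helps prevent the numerical blow-ups that cause training instability.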

Item Type: Article
Uncontrolled Keywords: Chemical activation, Computation theory, Computational efficiency, Convergence of numerical methods, Convolution, Mean square error, Neural networks, Pattern recognition, Stability, Activation functions, Convolutional neural network, Deep neural networks, Generalization performance, Output boundary, Hyperbolic functions, algorithm, Article, controlled study, data base, delay discounting, facial expression, human, learning, learning algorithm, nerve cell network, pattern recognition, perceptron, priority journal, probability, task performance, training, visual discrimination
Subjects: T Technology > TK Electrical engineering. Electronics. Nuclear engineering
Divisions: Electrical Engineering
ID Code: 71526
Deposited By: Fazli Masari
Deposited On: 14 Nov 2017 07:00
Last Modified: 14 Nov 2017 07:00
