
ISSN 2063-5346

COMPARATIVE ANALYSIS OF VARIOUS ACTIVATION FUNCTIONS IN CONVOLUTIONAL NEURAL NETWORK FOR SIGN LANGUAGE RECOGNITION

Main Article Content

Sornam M1*, Panneer Selvam E
» doi: 10.48047/ecb/2023.12.si5a.0305

Abstract

Communication is the act of sharing and receiving information; the modality of spoken language is oral-auditory, whereas that of sign language is visual-gestural, which makes the interpretation of sign languages substantially different. In most cases a convolutional neural network (CNN) model is used to recognize sign language, where the activation function is a fundamental component of the hierarchical structure of the CNN due to its nonlinear properties. On the basis of the Keras framework, eight common activation functions, namely ReLU, Leaky ReLU, PReLU, ReLU6, SELU, Swish, Hard Swish, and Mish, have been analyzed and evaluated on sign language recognition tasks. This work evaluates the performance of a CNN with each of these activation functions, and the experimental results show that the CNN based on the Mish activation function achieves a marked improvement in performance over the other activation functions.
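For reference, the activation functions compared in the abstract can be written down directly. The following is a minimal pure-Python sketch of their standard scalar definitions (not code from the paper, which uses Keras layers); the SELU constants are the commonly published values, and PReLU's slope is shown as an explicit argument even though in a CNN it is a learned parameter.

```python
import math

def relu(x):
    # ReLU: zero for negative inputs, identity otherwise
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small fixed slope alpha on the negative side
    return x if x > 0 else alpha * x

def prelu(x, alpha):
    # PReLU: same shape as Leaky ReLU, but alpha is learned
    # during training; passed explicitly here for illustration.
    return x if x > 0 else alpha * x

def relu6(x):
    # ReLU6: ReLU clipped at 6
    return min(max(0.0, x), 6.0)

def selu(x, scale=1.0507009873554805, alpha=1.6732632423543772):
    # SELU: scaled exponential linear unit (self-normalizing)
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    # Swish: x * sigmoid(x)
    return x * sigmoid(x)

def hard_swish(x):
    # Hard Swish: piecewise-linear approximation of Swish
    return x * relu6(x + 3.0) / 6.0

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * math.tanh(math.log1p(math.exp(x)))
```

All of these are smooth or piecewise-linear nonlinearities; the differences lie in how they treat negative inputs and whether they saturate, which is what drives the performance differences the abstract reports.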

Article Details