A comparative analysis of various activation functions and optimizers in a convolutional neural network for hyperspectral image classification
Citation
Seyrek, E.C., Uysal, M. A comparative analysis of various activation functions and optimizers in a convolutional neural network for hyperspectral image classification. Multimed Tools Appl 83, 53785–53816 (2024). https://doi.org/10.1007/s11042-023-17546-5
Abstract
Hyperspectral imaging has a strong capability for distinguishing surface objects thanks to its ability to collect hundreds of bands along the electromagnetic spectrum. Hyperspectral image classification, one of the major tasks of hyperspectral image processing, is a challenging process due to the characteristics of the dataset under consideration. Alongside a variety of traditional algorithms, the convolutional neural network (CNN) has gained popularity in recent years thanks to its excellent performance. Activation functions and optimizers play a crucial role in the learning process of a CNN model. In this paper, a comparative analysis using a set of different activation functions and optimizers was performed. For this purpose, six activation functions (LReLU, Mish, PReLU, ReLU, Sigmoid, and Swish) and four optimizers (Adam, Adamax, Nadam, and RMSProp) were utilized on a CNN model. Two publicly available datasets, Indian Pines and WHU-Hi HongHu, were used in the experiments. According to the results, the CNN model using the Adamax optimizer and the Mish activation function achieved the best overall accuracies on the Indian Pines and WHU-Hi HongHu datasets, at 98.32% and 97.54%, respectively.
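The six activation functions compared in the abstract have standard closed-form definitions, sketched below in NumPy. The α defaults for LReLU and PReLU are illustrative assumptions (common library defaults), not values taken from the paper; in practice PReLU learns α during training rather than fixing it.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified linear unit: max(0, x)."""
    return np.maximum(0.0, x)

def lrelu(x, alpha=0.01):
    """Leaky ReLU: small fixed slope alpha for negative inputs."""
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha=0.25):
    """Parametric ReLU: like Leaky ReLU, but alpha is a learned
    parameter; a fixed alpha is used here only for illustration."""
    return np.where(x > 0, x, alpha * x)

def swish(x):
    """Swish: x * sigmoid(x)."""
    return x * sigmoid(x)

def mish(x):
    """Mish: x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x)."""
    return x * np.tanh(np.log1p(np.exp(x)))
```

All six are smooth or piecewise-linear maps applied element-wise to layer outputs; Mish and Swish differ from ReLU mainly in being smooth and non-monotonic near zero, which is the property the paper's comparison probes.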
Source
Multimedia Tools and Applications
Volume
83
Collections
- Makaleler (Articles) [10]