Improving Classification Performance of Softmax Loss Function Based on Scalable Batch-Normalization
Convolutional neural networks (CNNs) have achieved remarkable success in computer vision tasks, especially image classification. With improvements in network architectures and loss functions, image classification performance has steadily increased. The classic Softmax + cross-entropy loss...
| Main Authors: | Qiuyu Zhu, Zikuang He, Tao Zhang, Wennan Cui |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2020-04-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/10/8/2950 |
Similar Items
- Combining Convolutional Neural Network and Photometric Refinement for Accurate Homography Estimation
  by: Lai Kang, et al.
  Published: (2019-01-01)
- Large-Margin Regularized Softmax Cross-Entropy Loss
  by: Xiaoxu Li, et al.
  Published: (2019-01-01)
- Improved softmax loss for deep learning-based face and expression recognition
  by: Jiancan Zhou, et al.
  Published: (2019-09-01)
- Batch Similarity Based Triplet Loss Assembled into Light-Weighted Convolutional Neural Networks for Medical Image Classification
  by: Zhiwen Huang, et al.
  Published: (2021-01-01)
- Hardware Implementation of a Softmax-Like Function for Deep Learning
  by: Ioannis Kouretas, et al.
  Published: (2020-08-01)