TanhExp: A smooth activation function with high convergence speed for lightweight neural networks
Abstract: Lightweight or mobile neural networks used for real-time computer vision tasks contain fewer parameters than standard networks, which leads to constrained performance. Herein, a novel activation function named the Tanh Exponential Activation Function (TanhExp) is proposed, which can significantly improve the performance of these networks on image classification tasks. The definition of TanhExp is f(x) = x tanh(e^x). The simplicity, efficiency, and robustness of TanhExp are demonstrated on various datasets and network models, and TanhExp outperforms its counterparts in both convergence speed and accuracy. Its behaviour also remains stable even with noise added and the dataset altered. It is shown that, without increasing the size of the network, the capacity of lightweight neural networks can be enhanced by TanhExp with only a few training epochs and no extra parameters added.
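The abstract's only formula is the activation itself, f(x) = x·tanh(e^x). As a quick illustration, a minimal sketch in plain Python (this is not the authors' released code; the large-input shortcut is an added assumption to avoid floating-point overflow in `exp`):

```python
import math

def tanhexp(x: float) -> float:
    """TanhExp activation: f(x) = x * tanh(exp(x))."""
    # For large x, exp(x) overflows a double, but tanh(exp(x)) is already
    # indistinguishable from 1.0, so the function reduces to the identity.
    if x > 20.0:
        return x
    return x * math.tanh(math.exp(x))
```

For large positive inputs the function approaches the identity (tanh saturates at 1), while negative inputs are damped towards zero, a shape broadly similar to Swish and Mish.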
Main Authors: | Xinyu Liu, Xiaoguang Di |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2021-03-01 |
Series: | IET Computer Vision |
Online Access: | https://doi.org/10.1049/cvi2.12020 |
id | doaj-f129ff1097424320aa130d3e82958c5a |
---|---|
record_format | Article |
Authors and affiliations: Xinyu Liu; Xiaoguang Di (both: Control and Simulation Center, Harbin Institute of Technology, Harbin, China)
Citation: IET Computer Vision, vol. 15, no. 2, pp. 136–150, 2021-03-01. https://doi.org/10.1049/cvi2.12020
collection | DOAJ |
issn | 1751-9632, 1751-9640 |