Learning algorithm analysis for deep neural network with ReLu activation functions
The article emphasizes the modern artificial neural network structure known in the literature as a deep neural network. The network includes more than one hidden layer and comprises many standard modules with the ReLU nonlinear activation function. A learning algorithm includes two sta...
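The abstract describes a feed-forward network with more than one hidden layer, each hidden layer applying a ReLU nonlinearity. As a point of reference only, the following is a minimal sketch of such a structure, not the authors' implementation; the layer sizes, initialization, and data are illustrative assumptions.

```python
import numpy as np

def relu(x):
    # ReLU activation: elementwise max(0, x)
    return np.maximum(0.0, x)

def forward(x, weights, biases):
    """Propagate x through the network: affine map + ReLU on every
    hidden layer, a plain linear map on the output layer."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = relu(W @ a + b)
    return weights[-1] @ a + biases[-1]

rng = np.random.default_rng(0)
sizes = [4, 8, 8, 1]  # two hidden layers, i.e. a "deep" network (assumed sizes)
weights = [rng.standard_normal((m, n)) * np.sqrt(2.0 / n)  # He-style init for ReLU
           for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

print(forward(rng.standard_normal(4), weights, biases))
```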
| Main Authors: | Płaczek Stanisław, Płaczek Aleksander |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | EDP Sciences, 2018-01-01 |
| Series: | ITM Web of Conferences |
| Online Access: | https://doi.org/10.1051/itmconf/20181901009 |
Similar Items
- Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations
  by: Boris Hanin
  Published: (2019-10-01)
- Studying Perturbations on the Input of Two-Layer Neural Networks with ReLU Activation
  by: Alsubaihi, Salman
  Published: (2019)
- A Global Universality of Two-Layer Neural Networks with ReLU Activations
  by: Naoya Hatano, et al.
  Published: (2021-01-01)
- Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning
  by: Hock Hung Chieng, et al.
  Published: (2018-07-01)
- Gaussian Perturbations in ReLU Networks and the Arrangement of Activation Regions
  by: Daróczy, B.
  Published: (2022)