Gaussian Perturbations in ReLU Networks and the Arrangement of Activation Regions
Recent articles indicate that deep neural networks are efficient models for various learning problems. However, they are often highly sensitive to small input changes that cannot be detected by an independent observer. As our understanding of deep neural networks with traditional generalisation bounds s...
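The abstract concerns how Gaussian input perturbations interact with the activation regions of a ReLU network: each input lies in a region determined by which hidden units are active. As a rough illustration of that phenomenon (a minimal sketch, not the paper's construction; the network sizes, noise scale `sigma`, and trial count are arbitrary assumptions), the following estimates how often a small Gaussian perturbation changes the activation pattern of a random two-layer ReLU network:

```python
# Hypothetical sketch: how often does a small Gaussian input perturbation
# move an input into a different activation region of a ReLU layer?
# All sizes and the noise scale below are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
d_in, width = 8, 32                      # input dimension and hidden width (assumed)
W = rng.standard_normal((width, d_in))   # random first-layer weights
b = rng.standard_normal(width)           # random biases

def activation_pattern(x):
    """Boolean pattern indicating which ReLU units are active at input x."""
    return W @ x + b > 0

x = rng.standard_normal(d_in)   # base input
sigma = 0.05                    # perturbation scale (assumed)
trials = 1000
base = activation_pattern(x)

# Count how many units flip (active <-> inactive) per Gaussian perturbation.
flips = [
    np.count_nonzero(activation_pattern(x + sigma * rng.standard_normal(d_in)) != base)
    for _ in range(trials)
]
print(f"mean units flipped per perturbation: {np.mean(flips):.2f} / {width}")
```

A nonzero flip count means the perturbed input has crossed into a neighbouring activation region, which is the kind of sensitivity the abstract alludes to.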
Main Author: Daróczy, B. (Author)
Format: Article
Language: English
Published: MDPI, 2022
Online Access: View Fulltext in Publisher
Similar Items
- Studying Perturbations on the Input of Two-Layer Neural Networks with ReLU Activation
  by: Alsubaihi, Salman
  Published: (2019)
- ReLU Network with Bounded Width Is a Universal Approximator in View of an Approximate Identity
  by: Sunghwan Moon
  Published: (2021-01-01)
- Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations
  by: Boris Hanin
  Published: (2019-10-01)
- A new scheme for training ReLU-based multi-layer feedforward neural networks
  by: Wang, Hao
  Published: (2017)
- Quantum ReLU activation for Convolutional Neural Networks to improve diagnosis of Parkinson’s disease and COVID-19
  by: Parisi, Luca, et al.
  Published: (2021)