A Global Universality of Two-Layer Neural Networks with ReLU Activations
In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in function spaces. Many prior works address convergence over compact sets. In the present paper, we consider global convergence by introducing a nor...
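As a concrete illustration of the two-layer ReLU networks the abstract refers to, the sketch below (a hypothetical example, not taken from the paper) builds a network of the form f(x) = Σᵢ cᵢ·ReLU(aᵢx + bᵢ) and uses the identity |x| = ReLU(x) + ReLU(−x) to represent the absolute-value function exactly:

```python
import numpy as np

def relu(x):
    """Elementwise ReLU activation: max(x, 0)."""
    return np.maximum(x, 0.0)

def two_layer_net(x, a, b, c):
    """Two-layer (one hidden layer) ReLU network:
    f(x) = sum_i c_i * ReLU(a_i * x + b_i)."""
    return sum(ci * relu(ai * x + bi) for ai, bi, ci in zip(a, b, c))

# With weights a=(1,-1), biases b=(0,0), and outer weights c=(1,1),
# the network represents |x| exactly, since |x| = ReLU(x) + ReLU(-x).
x = np.linspace(-2.0, 2.0, 9)
approx = two_layer_net(x, a=[1.0, -1.0], b=[0.0, 0.0], c=[1.0, 1.0])
print(np.allclose(approx, np.abs(x)))  # prints True
```

Universality results of the kind studied in the article concern how densely such finite sums sit in a given function space; the classical theorems establish density on compact sets, whereas the article's contribution is a global (whole-space) notion of convergence.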
Main Authors: Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano
Format: Article
Language: English
Published: Hindawi Limited, 2021-01-01
Series: Journal of Function Spaces
Online Access: http://dx.doi.org/10.1155/2021/6637220
Similar Items
- Studying Perturbations on the Input of Two-Layer Neural Networks with ReLU Activation
  by: Alsubaihi, Salman
  Published: (2019)
- A new scheme for training ReLU-based multi-layer feedforward neural networks
  by: Wang, Hao
  Published: (2017)
- Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations
  by: Boris Hanin
  Published: (2019-10-01)
- Learning algorithm analysis for deep neural network with ReLu activation functions
  by: Płaczek Stanisław, et al.
  Published: (2018-01-01)
- Gaussian Perturbations in ReLU Networks and the Arrangement of Activation Regions
  by: Daróczy, B.
  Published: (2022)