Fisher-Rao metric, geometry, and complexity of neural networks

© 2019 by the author(s). We study the relationship between geometry and capacity measures for deep neural networks from an invariance viewpoint. We introduce a new notion of capacity - the Fisher-Rao norm - that possesses desirable invariance properties and is motivated by Information Geometry. We discover an analytical characterization of the new capacity measure, through which we establish norm-comparison inequalities and further show that the new measure serves as an umbrella for several existing norm-based complexity measures. We discuss upper bounds on the generalization error induced by the proposed measure. Extensive numerical experiments on CIFAR-10 support our theoretical findings. Our theoretical analysis rests on a key structural lemma about partial derivatives of multi-layer rectifier networks.
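A minimal sketch of the structural property the abstract alludes to (not the authors' code, and the network sizes are illustrative): a bias-free multilayer rectifier network is positively homogeneous of degree L+1 in its weights when it has L+1 weight matrices, since ReLU satisfies relu(c·z) = c·relu(z) for c > 0. By Euler's theorem for homogeneous functions, this yields the identity ⟨θ, ∇_θ f_θ(x)⟩ = (L+1)·f_θ(x) that underlies the Fisher-Rao norm analysis. The snippet checks the homogeneity numerically:

```python
import numpy as np

relu = lambda z: np.maximum(z, 0.0)

def relu_net(weights, x):
    """Bias-free multilayer rectifier network with a linear output layer."""
    h = x
    for W in weights[:-1]:
        h = relu(W @ h)
    return weights[-1] @ h

rng = np.random.default_rng(0)
# Three weight matrices, i.e. L + 1 = 3 (illustrative shapes)
weights = [rng.standard_normal((4, 3)),
           rng.standard_normal((4, 4)),
           rng.standard_normal((1, 4))]
x = rng.standard_normal(3)

c = 2.0
f = relu_net(weights, x)
f_scaled = relu_net([c * W for W in weights], x)

# Scaling every weight by c > 0 scales the output by c**(L+1)
print(np.allclose(f_scaled, c**3 * f))  # → True
```

Because each ReLU layer passes positive scalars through, the factor c from every one of the L+1 weight matrices multiplies out front, which is exactly the degree-(L+1) homogeneity the paper's lemma exploits.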


Bibliographic Details
Main Authors: Liang, T (Author), Poggio, T (Author), Rakhlin, A (Author), Stokes, J (Author)
Format: Article
Language: English
Published: 2021-12-02T20:14:53Z.
Online Access: Get fulltext (https://hdl.handle.net/1721.1/138296)
LEADER 01365 am a22001813u 4500
001 138296
042 |a dc 
100 1 0 |a Liang, T  |e author 
700 1 0 |a Poggio, T  |e author 
700 1 0 |a Rakhlin, A  |e author 
700 1 0 |a Stokes, J  |e author 
245 0 0 |a Fisher-Rao metric, geometry, and complexity of neural networks 
260 |c 2021-12-02T20:14:53Z. 
856 |z Get fulltext  |u https://hdl.handle.net/1721.1/138296 
520 |a © 2019 by the author(s). We study the relationship between geometry and capacity measures for deep neural networks from an invariance viewpoint. We introduce a new notion of capacity - the Fisher-Rao norm - that possesses desirable invariance properties and is motivated by Information Geometry. We discover an analytical characterization of the new capacity measure, through which we establish norm-comparison inequalities and further show that the new measure serves as an umbrella for several existing norm-based complexity measures. We discuss upper bounds on the generalization error induced by the proposed measure. Extensive numerical experiments on CIFAR-10 support our theoretical findings. Our theoretical analysis rests on a key structural lemma about partial derivatives of multi-layer rectifier networks. 
546 |a en 
655 7 |a Article 
773 |t AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics