Neural Networks and the Natural Gradient
Format: Others

Published: DigitalCommons@USU, 2010

Online Access:
https://digitalcommons.usu.edu/etd/539
https://digitalcommons.usu.edu/cgi/viewcontent.cgi?article=1535&context=etd

Summary: Neural network training algorithms have always suffered from the problem of local minima. The advent of natural gradient algorithms promised to overcome this shortcoming by finding better local minima. However, they require additional training parameters and computational overhead. By using a new formulation for the natural gradient, an algorithm is described that uses less memory and processing time than previous algorithms with comparable performance.
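
For context, below is a minimal sketch of the standard natural-gradient update, theta <- theta - lr * F^{-1} * grad, where F is the (empirical) Fisher information matrix. It is not the memory-efficient formulation described in the thesis; it only illustrates the plain formulation whose storage and processing cost such work aims to reduce. The model (a small softmax classifier), the use of NumPy, and all names (`fisher_matrix`, `natural_gradient_step`, `lr`, `damping`) are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def loss_grad(W, X, y_onehot):
    """Cross-entropy gradient w.r.t. the flattened weight matrix W."""
    p = softmax(X @ W)
    return (X.T @ (p - y_onehot) / X.shape[0]).ravel()

def fisher_matrix(W, X):
    """Monte-Carlo Fisher: average outer product of per-example gradients,
    with labels sampled from the model's own predictive distribution."""
    n, d = X.shape
    k = W.shape[1]
    p = softmax(X @ W)
    F = np.zeros((d * k, d * k))
    for i in range(n):
        yi = np.eye(k)[np.random.choice(k, p=p[i])]
        gi = np.outer(X[i], p[i] - yi).ravel()
        F += np.outer(gi, gi)
    return F / n

def natural_gradient_step(W, X, y_onehot, lr=0.1, damping=1e-3):
    """One plain natural-gradient step: solve (F + damping*I) s = grad,
    then move against s. The dense O(p^3) solve and O(p^2) storage are
    exactly the overhead that cheaper formulations try to avoid."""
    g = loss_grad(W, X, y_onehot)
    F = fisher_matrix(W, X)
    step = np.linalg.solve(F + damping * np.eye(F.shape[0]), g)
    return W - lr * step.reshape(W.shape)
```

Applied repeatedly on a toy dataset, `W = natural_gradient_step(W, X, Y)` descends the loss along the Fisher-preconditioned direction rather than the raw gradient, which is the behavior natural-gradient methods exploit to reach better local minima.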