Neural Networks and the Natural Gradient

Neural network training algorithms have always suffered from the problem of local minima. The advent of natural gradient algorithms promised to overcome this shortcoming by finding better local minima. However, they require additional training parameters and computational overhead. Using a new formulation of the natural gradient, this thesis describes an algorithm that uses less memory and processing time than previous algorithms while achieving comparable performance.
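
For context, the standard natural gradient update the abstract refers to preconditions the ordinary gradient with the inverse of the Fisher information matrix. The Python sketch below is a minimal illustration of that idea, assuming an empirical Fisher estimate built from per-example gradients and a small damping term; the function name and parameters are illustrative and are not taken from the thesis, whose contribution is a more memory- and compute-efficient formulation of this update.

    # Illustrative sketch of a single natural-gradient step (not the thesis's
    # formulation). Uses an empirical Fisher information matrix with damping.
    import numpy as np

    def natural_gradient_step(theta, per_example_grads, lr=0.1, damping=1e-4):
        # Ordinary gradient: mean of the per-example log-likelihood gradients.
        grad = per_example_grads.mean(axis=0)
        # Empirical Fisher matrix: average outer product of per-example
        # gradients, plus damping so the matrix stays invertible.
        n = per_example_grads.shape[0]
        fisher = per_example_grads.T @ per_example_grads / n
        fisher += damping * np.eye(theta.shape[0])
        # Natural gradient: solve F * g_nat = grad instead of forming F^{-1}.
        nat_grad = np.linalg.solve(fisher, grad)
        return theta - lr * nat_grad

    # Example usage: 5 parameters, 100 per-example gradients.
    theta = np.zeros(5)
    grads = np.random.randn(100, 5)
    theta = natural_gradient_step(theta, grads)

Because forming and inverting the full Fisher matrix scales poorly with the number of parameters, approaches that avoid the explicit inverse (as the abstract indicates this work does) are what make natural-gradient training practical for larger networks.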


Bibliographic Details
Main Author: Bastian, Michael R.
Format: Others
Published: DigitalCommons@USU, 2010
Subjects: Backpropagation; Fisher Information Matrix; Natural Gradient; Neural Networks; Newton Methods; Riemannian Geometry; Electrical and Computer Engineering
Online Access: https://digitalcommons.usu.edu/etd/539
https://digitalcommons.usu.edu/cgi/viewcontent.cgi?article=1535&context=etd
Collection: All Graduate Theses and Dissertations, DigitalCommons@USU
Rights: Copyright for this work is held by the author. Transmission or reproduction of materials protected by copyright beyond that allowed by fair use requires the written permission of the copyright owners. Works not in the public domain cannot be commercially exploited without permission of the copyright owner. Responsibility for any use rests exclusively with the user. For more information contact Andrew Wesolek (andrew.wesolek@usu.edu).