Neural networks. A general framework for non-linear function approximation

The focus of this paper is on the neural network modelling approach that has gained increasing recognition in GIScience in recent years. The novelty of neural networks lies in their ability to model non-linear processes with few, if any, a priori assumptions about the nature of the data-generating process.


Bibliographic Details
Main Author: Fischer, Manfred M.
Format: Others
Language: de
Published: Wiley-Blackwell 2006
Online Access: http://epub.wu.ac.at/5493/1/NeuralNetworks.pdf
http://dx.doi.org/10.1111/j.1467-9671.2006.01010.x
id ndltd-VIENNA-oai-epub.wu-wien.ac.at-5493
record_format oai_dc
spelling ndltd-VIENNA-oai-epub.wu-wien.ac.at-54932017-03-25T05:42:07Z Neural networks. A general framework for non-linear function approximation Fischer, Manfred M. The focus of this paper is on the neural network modelling approach that has gained increasing recognition in GIScience in recent years. The novelty of neural networks lies in their ability to model non-linear processes with few, if any, a priori assumptions about the nature of the data-generating process. The paper discusses some important issues that are central for successful application development. The scope is limited to feedforward neural networks, the leading example of neural networks. It is argued that failures in applications can usually be attributed to inadequate learning and/or inadequate complexity of the network model. Parameter estimation and a suitably chosen number of hidden units are, thus, of crucial importance for the success of real-world neural network applications. The paper views network learning as an optimization problem, reviews two alternative approaches to network learning, and provides insights into current best practice to optimize complexity so as to perform well on generalization tasks. Wiley-Blackwell 2006 Article PeerReviewed de application/pdf http://epub.wu.ac.at/5493/1/NeuralNetworks.pdf http://dx.doi.org/10.1111/j.1467-9671.2006.01010.x http://onlinelibrary.wiley.com/ http://dx.doi.org/10.1111/j.1467-9671.2006.01010.x http://epub.wu.ac.at/5493/
collection NDLTD
language de
format Others
sources NDLTD
description The focus of this paper is on the neural network modelling approach that has gained increasing recognition in GIScience in recent years. The novelty of neural networks lies in their ability to model non-linear processes with few, if any, a priori assumptions about the nature of the data-generating process. The paper discusses some important issues that are central for successful application development. The scope is limited to feedforward neural networks, the leading example of neural networks. It is argued that failures in applications can usually be attributed to inadequate learning and/or inadequate complexity of the network model. Parameter estimation and a suitably chosen number of hidden units are, thus, of crucial importance for the success of real-world neural network applications. The paper views network learning as an optimization problem, reviews two alternative approaches to network learning, and provides insights into current best practice to optimize complexity so as to perform well on generalization tasks.
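The abstract frames network learning as an optimization problem, with the number of hidden units as the complexity parameter that governs generalization. The following sketch is not taken from the paper; it is a minimal, generic illustration of that framing: a single-hidden-layer feedforward network fit to a toy non-linear target by batch gradient descent on a squared-error loss. All settings (target function, hidden-layer size, learning rate) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-linear data-generating process: y = sin(3x) on [-1, 1]
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X)

H = 10  # number of hidden units: the model-complexity choice the paper stresses
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)

lr = 0.1
n = X.shape[0]
for step in range(5000):
    # Forward pass: tanh hidden layer, linear output
    A = np.tanh(X @ W1 + b1)       # hidden activations, shape (n, H)
    pred = A @ W2 + b2             # network output, shape (n, 1)
    err = pred - y
    loss = np.mean(err ** 2)       # squared-error objective being minimized

    # Backward pass: gradients of the mean squared error
    dpred = 2 * err / n
    dW2 = A.T @ dpred; db2 = dpred.sum(0)
    dA = dpred @ W2.T
    dZ = dA * (1 - A ** 2)         # derivative of tanh
    dW1 = X.T @ dZ; db1 = dZ.sum(0)

    # Gradient-descent update: "network learning as an optimization problem"
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Varying `H` reproduces the trade-off the abstract alludes to: too few hidden units underfit the non-linear target, while too many invite overfitting unless complexity is otherwise controlled.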
author Fischer, Manfred M.
spellingShingle Fischer, Manfred M.
Neural networks. A general framework for non-linear function approximation
author_facet Fischer, Manfred M.
author_sort Fischer, Manfred M.
title Neural networks. A general framework for non-linear function approximation
title_short Neural networks. A general framework for non-linear function approximation
title_full Neural networks. A general framework for non-linear function approximation
title_fullStr Neural networks. A general framework for non-linear function approximation
title_full_unstemmed Neural networks. A general framework for non-linear function approximation
title_sort neural networks. a general framework for non-linear function approximation
publisher Wiley-Blackwell
publishDate 2006
url http://epub.wu.ac.at/5493/1/NeuralNetworks.pdf
http://dx.doi.org/10.1111/j.1467-9671.2006.01010.x
work_keys_str_mv AT fischermanfredm neuralnetworksageneralframeworkfornonlinearfunctionapproximation
_version_ 1718434764723060736