Generalization and properties of the neural response

Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010. This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections. Cataloged from the student-submitted PDF version of the thesis. Includes bibliographical references (p. 79-83).

Hierarchical learning algorithms have enjoyed tremendous growth in recent years, with many new algorithms being proposed and applied to a wide range of applications. However, despite the apparent success of hierarchical algorithms in practice, the theory of hierarchical architectures remains at an early stage. In this thesis we study the theoretical properties of hierarchical algorithms from a mathematical perspective. Our work is based on the framework of hierarchical architectures introduced by Smale et al. in the paper "Mathematics of the Neural Response", Foundations of Computational Mathematics, 2010. We propose a generalized definition of the neural response and derived kernel that allows us to integrate some existing hierarchical algorithms into our framework. We then use this generalized definition to analyze the theoretical properties of hierarchical architectures. Our analysis focuses on three aspects of the hierarchy. First, we show that a wide class of architectures suffers from range compression: essentially, the derived kernel becomes increasingly saturated at each layer. Second, we show that the complexity of a linear architecture is constrained by the complexity of the first layer, and in some cases the architecture collapses into a single-layer linear computation. Finally, we characterize the discrimination and invariance properties of the derived kernel when the input data are one-dimensional strings. We believe that these theoretical results will provide a useful foundation for guiding future developments in the theory of hierarchical algorithms.

Main Author: | Wibisono, Andre Yohannes
---|---
Other Authors: | Tomaso Poggio and Lorenzo Rosasco
Format: | Thesis (83 p., application/pdf)
Language: | English
Published: | Massachusetts Institute of Technology, 2011
Subjects: | Electrical Engineering and Computer Science
Online Access: | http://hdl.handle.net/1721.1/61517

id | ndltd-MIT-oai-dspace.mit.edu-1721.1-61517
---|---
record_format | oai_dc
department | Massachusetts Institute of Technology. Dept. of Electrical Engineering and Computer Science
degree | M.Eng.
type | Thesis
dates | 2011-03-07T14:38:26Z; 2010
identifiers | http://hdl.handle.net/1721.1/61517; 703286375
rights | M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See http://dspace.mit.edu/handle/1721.1/7582 for inquiries about permission.
physical description | 83 p., application/pdf
collection | NDLTD
language | English
format | Others
sources | NDLTD
topic | Electrical Engineering and Computer Science
author | Wibisono, Andre Yohannes
author2 | Tomaso Poggio and Lorenzo Rosasco
title | Generalization and properties of the neural response
publisher | Massachusetts Institute of Technology
publishDate | 2011
url | http://hdl.handle.net/1721.1/61517
_version_ | 1719044391088160768
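
The abstract above refers to the neural response and derived kernel of Smale et al., "Mathematics of the Neural Response" (2010). For orientation only, the following is a minimal sketch of that recursion on one-dimensional strings; it is not code from the thesis, and the base kernel, template sets, and function names (`initial_kernel`, `patches`, `neural_response`, `derived_kernel`) are invented for illustration.

```python
import math

def initial_kernel(a, b):
    # Base kernel on single characters: 1 if equal, 0 otherwise (toy choice).
    return 1.0 if a == b else 0.0

def patches(s, length):
    # All contiguous substrings of the given length; these play the role of the
    # transformations applied to the input before pooling.
    return [s[i:i + length] for i in range(len(s) - length + 1)]

def neural_response(f, templates, kernel, patch_len):
    # N(f)(t) = max over patches h of f of K(h, t), one entry per template t.
    return [max(kernel(h, t) for h in patches(f, patch_len)) for t in templates]

def derived_kernel(templates, kernel, patch_len):
    # Build the next-layer kernel: normalized inner product of neural responses.
    def K(f, g):
        nf = neural_response(f, templates, kernel, patch_len)
        ng = neural_response(g, templates, kernel, patch_len)
        dot = sum(x * y for x, y in zip(nf, ng))
        norm = math.sqrt(sum(x * x for x in nf)) * math.sqrt(sum(y * y for y in ng))
        return dot / norm if norm > 0 else 0.0
    return K

# Layer 1: single characters as patches, the alphabet as templates.
K1 = derived_kernel(templates=["a", "b"], kernel=initial_kernel, patch_len=1)

# Layer 2: length-2 substrings as patches, a few length-2 templates.
K2 = derived_kernel(templates=["ab", "ba", "aa"], kernel=K1, patch_len=2)

print(K2("abab", "baba"))  # close to 1: similar despite the one-character shift
print(K2("abab", "aaaa"))  # noticeably smaller
```

Under these toy choices the layer-2 kernel rates shifted copies of the same pattern as nearly identical while separating them from a constant string, loosely illustrating the kind of discrimination and invariance trade-off the thesis analyzes for string inputs.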