Construction of Neural Networks for Realization of Localized Deep Learning
The subject of deep learning has recently attracted users of machine learning from various disciplines, including medical diagnosis and bioinformatics, financial market analysis and online advertisement, speech and handwriting recognition, computer vision and natural language processing, time series forecasting, and search engines. However, the theoretical development of deep learning is still in its infancy. The objective of this paper is to introduce a deep neural network (also called a deep-net) approach to localized manifold learning, with each hidden layer endowed with a specific learning task. For the purpose of illustration, we focus only on deep-nets with three hidden layers: the first layer for dimensionality reduction, the second for bias reduction, and the third for variance reduction. A feedback component is also designed to deal with outliers. The main theoretical result of this paper is the order O(m^(-2s/(2s+d))) of approximation of the regression function with regularity s, in terms of the number m of sample points, where the (unknown) manifold dimension d replaces the dimension D of the sampling (Euclidean) space that governs shallow nets.
Main Authors: | Charles K. Chui, Shao-Bo Lin, Ding-Xuan Zhou |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2018-05-01 |
Series: | Frontiers in Applied Mathematics and Statistics |
Subjects: | deep nets; learning theory; deep learning; manifold learning; feedback |
Online Access: | http://journal.frontiersin.org/article/10.3389/fams.2018.00014/full |
id |
doaj-9c904df4434d463dbbd3a03f4d617718 |
record_format |
Article |
spelling |
Frontiers Media S.A., Frontiers in Applied Mathematics and Statistics (ISSN 2297-4687), vol. 4, 2018-05-01, DOI 10.3389/fams.2018.00014. Construction of Neural Networks for Realization of Localized Deep Learning. Charles K. Chui (Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong; Department of Statistics, Stanford University, Stanford, CA, United States), Shao-Bo Lin (Department of Mathematics, Wenzhou University, Wenzhou, China), Ding-Xuan Zhou (Department of Mathematics, City University of Hong Kong, Kowloon, Hong Kong). |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Charles K. Chui, Shao-Bo Lin, Ding-Xuan Zhou |
spellingShingle |
Charles K. Chui; Shao-Bo Lin; Ding-Xuan Zhou; Construction of Neural Networks for Realization of Localized Deep Learning; Frontiers in Applied Mathematics and Statistics; deep nets; learning theory; deep learning; manifold learning; feedback |
author_facet |
Charles K. Chui, Shao-Bo Lin, Ding-Xuan Zhou |
author_sort |
Charles K. Chui |
title |
Construction of Neural Networks for Realization of Localized Deep Learning |
publisher |
Frontiers Media S.A. |
series |
Frontiers in Applied Mathematics and Statistics |
issn |
2297-4687 |
publishDate |
2018-05-01 |
description |
The subject of deep learning has recently attracted users of machine learning from various disciplines, including medical diagnosis and bioinformatics, financial market analysis and online advertisement, speech and handwriting recognition, computer vision and natural language processing, time series forecasting, and search engines. However, the theoretical development of deep learning is still in its infancy. The objective of this paper is to introduce a deep neural network (also called a deep-net) approach to localized manifold learning, with each hidden layer endowed with a specific learning task. For the purpose of illustration, we focus only on deep-nets with three hidden layers: the first layer for dimensionality reduction, the second for bias reduction, and the third for variance reduction. A feedback component is also designed to deal with outliers. The main theoretical result of this paper is the order O(m^(-2s/(2s+d))) of approximation of the regression function with regularity s, in terms of the number m of sample points, where the (unknown) manifold dimension d replaces the dimension D of the sampling (Euclidean) space that governs shallow nets. |
topic |
deep nets; learning theory; deep learning; manifold learning; feedback |
url |
http://journal.frontiersin.org/article/10.3389/fams.2018.00014/full |
work_keys_str_mv |
AT charleskchui constructionofneuralnetworksforrealizationoflocalizeddeeplearning AT shaobolin constructionofneuralnetworksforrealizationoflocalizeddeeplearning AT dingxuanzhou constructionofneuralnetworksforrealizationoflocalizeddeeplearning |
_version_ |
1725057653733851136 |