A Comparison Study of Principal Component Analysis and Nonlinear Principal Component Analysis
In data analysis, reducing the dimensionality of data is important: it helps in understanding the data, extracting new knowledge from it, and decreasing computational cost. Principal Component Analysis (PCA) [1, 7, 19] has been applied in many areas as a dimensionality-reduction method. Nonlinear Principal Component Analysis (NLPCA) [1, 7, 19] was originally introduced as a nonlinear generalization of PCA. Both methods were tested on artificial and natural datasets: data sampled from F(x) = sin(x) + x, the Lorenz attractor, and sunspot data. The experimental results were analyzed and compared. Generally, NLPCA explains more variance than neural network PCA (NN PCA) in lower dimensions; however, as the dimension increases, the NLPCA approximation eventually loses its advantage. Finally, we introduce a new combination of NN PCA and NLPCA, and analyze and compare its performance.
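The linear baseline compared in the abstract, PCA computed via the Singular Value Decomposition, can be sketched as follows. This is a minimal NumPy illustration on data sampled from F(x) = sin(x) + x, one of the test sets named above; the variable names and sampling choices are illustrative assumptions, not code from the thesis.

```python
import numpy as np

# Sample a 2-D dataset from F(x) = sin(x) + x, one of the abstract's test functions.
rng = np.random.default_rng(0)
x = rng.uniform(-np.pi, np.pi, 500)
X = np.column_stack([x, np.sin(x) + x])      # shape (n_samples, 2)

# Center the data; PCA operates on mean-centered columns.
Xc = X - X.mean(axis=0)

# SVD of the centered data: rows of Vt are the principal directions,
# and the squared singular values are proportional to explained variance.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_variance_ratio = s**2 / np.sum(s**2)

# Reduce to one dimension: project onto the first principal component,
# then reconstruct the 2-D approximation from the 1-D scores.
Z = Xc @ Vt[0]                               # scores along PC1, shape (n_samples,)
X_recon = np.outer(Z, Vt[0]) + X.mean(axis=0)
```

Because sin(x) + x is nearly linear over this range, the first component captures almost all the variance here; the thesis's point is that on more strongly curved data (e.g. the Lorenz attractor) a nonlinear method can do better at the same reduced dimension.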
Other Authors: | |
---|---|
Format: | Others |
Language: | English |
Published: | Florida State University |
Subjects: | |
Online Access: | http://purl.flvc.org/fsu/fd/FSU_migr_etd-0704 |
id |
ndltd-fsu.edu-oai-fsu.digital.flvc.org-fsu_168881 |
record_format |
oai_dc |
collection |
NDLTD |
language |
English |
format |
Others
|
sources |
NDLTD |
topic |
Mathematics |
description |
In data analysis, reducing the dimensionality of data is important: it helps in understanding the data, extracting new knowledge from it, and decreasing computational cost. Principal Component Analysis (PCA) [1, 7, 19] has been applied in many areas as a dimensionality-reduction method. Nonlinear Principal Component Analysis (NLPCA) [1, 7, 19] was originally introduced as a nonlinear generalization of PCA. Both methods were tested on artificial and natural datasets: data sampled from F(x) = sin(x) + x, the Lorenz attractor, and sunspot data. The experimental results were analyzed and compared. Generally, NLPCA explains more variance than neural network PCA (NN PCA) in lower dimensions; however, as the dimension increases, the NLPCA approximation eventually loses its advantage. Finally, we introduce a new combination of NN PCA and NLPCA, and analyze and compare its performance. === A Thesis submitted to the Department of Mathematics in partial fulfillment of the requirements for the degree of Master of Science. === Degree Awarded: Spring semester, 2007. === Date of Defense: December 1, 2006. === FUV, Singular Value Decomposition, Variance, Principal Component Analysis, PCA, Neural Network, NN, Nonlinear Principal Component Analysis, NLPCA, Dimension Reduction, SVD === Includes bibliographical references. === Jerry F. Magnan, Professor Directing Thesis; Steven Bellenot, Committee Member; Mark Sussman, Committee Member. |
author2 |
Wu, Rui (authoraut) |
author_facet |
Wu, Rui (authoraut) |
title |
A Comparison Study of Principal Component Analysis and Nonlinear Principal Component Analysis |
title_short |
A Comparison Study of Principal Component Analysis and Nonlinear Principal Component Analysis |
title_full |
A Comparison Study of Principal Component Analysis and Nonlinear Principal Component Analysis |
title_fullStr |
A Comparison Study of Principal Component Analysis and Nonlinear Principal Component Analysis |
title_full_unstemmed |
A Comparison Study of Principal Component Analysis and Nonlinear Principal Component Analysis |
title_sort |
comparison study of principal component analysis and nonlinear principal component analysis |
publisher |
Florida State University |
url |
http://purl.flvc.org/fsu/fd/FSU_migr_etd-0704 |
_version_ |
1719215227040432128 |