Generalized Eigenvalue Proximal Support Vector Machine for Functional Data Classification

Bibliographic Details
Main Authors: Yuanyuan Chen, Zhixia Yang
Format: Article
Language: English
Published: MDPI AG 2021-05-01
Series: Symmetry
Online Access: https://www.mdpi.com/2073-8994/13/5/833
Description
Summary: Functional data analysis has become a research hotspot in the field of data mining. Traditional data mining methods treat functional data as discrete, finite observation sequences, ignoring their continuity. In this paper, we address functional data classification by proposing a functional generalized eigenvalue proximal support vector machine (FGEPSVM). Specifically, we find two nonparallel hyperplanes in function space: a positive functional hyperplane and a negative functional hyperplane. The former is closest to the positive functional data and furthest from the negative functional data, while the latter has the opposite properties. By introducing an orthonormal basis, the problem in function space is transformed into a problem in vector space. It should be pointed out that higher-order derivative information is used in two ways: we apply either the derivatives alone or a weighted linear combination of the original function and its derivatives. Using more of the data's information can be expected to improve classification accuracy. Experiments on artificial datasets and benchmark datasets show the effectiveness of our FGEPSVM for functional data classification.
ISSN: 2073-8994
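
The abstract outlines the general recipe (expand each curve in a basis to obtain coefficient vectors, then fit two nonparallel proximal hyperplanes via generalized eigenvalue problems in the GEPSVM style). The sketch below is only an illustration of that recipe under stated assumptions, not the authors' implementation: the Fourier basis, the Tikhonov regularization parameter delta, and the function names fourier_design, to_coefficients, gepsvm_plane, and fit_predict are all choices made for this example.

    # Illustrative GEPSVM-style classifier on basis coefficients of functional data.
    # Assumptions: curves are sampled on a common grid t, a Fourier basis is used
    # for the expansion, and a small Tikhonov term keeps the problems well posed.
    import numpy as np
    from scipy.linalg import eig

    def fourier_design(t, n_basis):
        """Evaluate a simple Fourier basis on the sampling grid t."""
        cols = [np.ones_like(t)]
        for k in range(1, (n_basis - 1) // 2 + 1):
            cols.append(np.sin(2 * np.pi * k * t))
            cols.append(np.cos(2 * np.pi * k * t))
        return np.column_stack(cols[:n_basis])

    def to_coefficients(curves, t, n_basis=7):
        """Project discretely observed curves (rows) onto the basis by least squares."""
        Phi = fourier_design(t, n_basis)                       # (len(t), n_basis)
        coef, *_ = np.linalg.lstsq(Phi, curves.T, rcond=None)  # (n_basis, n_curves)
        return coef.T

    def gepsvm_plane(A, B, delta=1e-4):
        """Hyperplane closest to the rows of A and farthest from the rows of B,
        obtained from the smallest eigenvalue of a generalized eigenvalue problem."""
        Ae = np.hstack([A, np.ones((len(A), 1))])
        Be = np.hstack([B, np.ones((len(B), 1))])
        G = Ae.T @ Ae + delta * np.eye(Ae.shape[1])  # Tikhonov regularization
        H = Be.T @ Be + delta * np.eye(Be.shape[1])
        vals, vecs = eig(G, H)
        z = np.real(vecs[:, np.argmin(np.real(vals))])
        return z[:-1], z[-1]                         # weights w and bias b

    def fit_predict(X_pos, X_neg, X_test, t):
        """Fit the two nonparallel hyperplanes and label test curves by the closer one."""
        Cp, Cn, Ct = (to_coefficients(X, t) for X in (X_pos, X_neg, X_test))
        w1, b1 = gepsvm_plane(Cp, Cn)                # proximal to the positive class
        w2, b2 = gepsvm_plane(Cn, Cp)                # proximal to the negative class
        d1 = np.abs(Ct @ w1 + b1) / np.linalg.norm(w1)
        d2 = np.abs(Ct @ w2 + b2) / np.linalg.norm(w2)
        return np.where(d1 <= d2, 1, -1)

In this sketch, the derivative information mentioned in the abstract could be incorporated by replacing the curves with their numerical derivatives, or by concatenating a weighted combination of the original and derivative coefficients, before calling gepsvm_plane; that step is omitted here for brevity.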