Summary: | Orthogonal Partial Least Squares (OPLS) methods aim to find dominant factors from the predictor variables that maximize the cross-covariance between these factors and the response variables while simultaneously maintaining a high correlation between them. Compared with discriminant analysis methods such as Linear Discriminant Analysis, OPLS simultaneously considers covariance maximization and data fitting. However, unlike discriminant analysis, which focuses on between-group discriminability, OPLS concentrates on cross-covariance that contains no discriminant information, which makes effective dominant factors harder to find. To rectify this drawback of OPLS, this study proposes 1) successively orthogonal deflation in a constrained noisy subspace and 2) an isotropic space transform for enhancing OPLS. The former explores successively orthogonal projective vectors in the subspace and iteratively updates the weighted signal space. The latter converts dimensions with unequal influences into dimensions with equal influences, correcting distortions. The two proposed rectifications are implemented in three variants of Maximum Covariance Analysis (MCA) to examine their gradually changing functionalities: i) Successive Subspace-MCA, ii) Isotropic Subspace-MCA, and iii) Successive Isotropic Subspace-MCA. Experiments on open datasets were carried out to compare the proposed approaches with the baseline. The results showed that the proposed rectifications maximized cross-covariance while fitting the data well, substantiating the effectiveness of the proposed idea.
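The baseline covariance-maximization step underlying MCA can be sketched as the singular value decomposition of the cross-covariance matrix between predictors and responses: the leading singular vector pairs are exactly the mutually orthogonal weight vectors with maximal cross-covariance. The sketch below illustrates only this standard baseline, not the paper's proposed subspace or isotropic rectifications; the function name and data are illustrative assumptions.

```python
import numpy as np

def mca(X, Y, n_factors=2):
    """Baseline Maximum Covariance Analysis (illustrative sketch):
    the leading singular vectors of the cross-covariance matrix
    give the factor pairs with maximal cross-covariance."""
    Xc = X - X.mean(axis=0)               # center predictors
    Yc = Y - Y.mean(axis=0)               # center responses
    C = Xc.T @ Yc / (X.shape[0] - 1)      # cross-covariance matrix
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    # Columns of U (resp. Vt.T) are orthonormal weight vectors for
    # X (resp. Y); s[k] is the covariance captured by the k-th pair.
    return U[:, :n_factors], Vt.T[:, :n_factors], s[:n_factors]

# Synthetic example: responses linearly related to predictors plus noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
Y = X @ rng.standard_normal((5, 3)) + 0.1 * rng.standard_normal((100, 3))
Wx, Wy, cov = mca(X, Y)
# Because the weights come from an SVD, successive factors are
# orthonormal by construction, with covariance sorted in decreasing order.
print(np.allclose(Wx.T @ Wx, np.eye(2)))  # True
```

Because the SVD yields all factor pairs at once with built-in orthogonality, explicit deflation is unnecessary in this plain setting; the paper's successive-deflation variant instead re-derives each direction inside an iteratively updated subspace.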
|