Summary: Master's thesis === National Dong Hwa University === Department of Applied Mathematics === 92 ===
This work deals with independent component analysis (ICA) via minimizing the Kullback-Leibler (KL) divergence between the joint probability density function of the output components and the product of their marginal probability density functions. To carry out the minimization, we estimate the marginal probability density functions with weighted Parzen windows. This formulation makes the marginal entropies, and hence the KL divergence, differentiable with respect to the demixing matrix. The approaches explored here for estimating the parameters of the weighted Parzen windows include the expectation-maximization (EM), hierarchical clustering, and annealed EM algorithms. Combined with the natural gradient descent method, these approaches yield a class of new algorithms for independent component analysis. We compare their performance with existing ICA algorithms, including the PottsNICA, FastICA, and JadeICA algorithms, on blind source separation of real signals.
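The abstract's core idea can be sketched in code: estimate each output component's marginal density with a (weighted) Parzen window of Gaussian kernels, differentiate it to obtain a score function, and plug that score into the standard natural-gradient ICA update W <- W + lr (I - E[phi(y) y^T]) W. The sketch below is illustrative only: it uses Gaussian kernels, centers one kernel on each sample with uniform weights, and fixes the bandwidth `h` by hand, omitting the EM / hierarchical-clustering / annealed-EM parameter fitting that the thesis actually develops. All function names and parameter values are assumptions, not the thesis's implementation.

```python
import numpy as np

def parzen_pdf_and_score(y, centers, weights, h):
    """Weighted Parzen window estimate of a 1-D pdf and its score
    phi(y) = -d/dy log p(y), using Gaussian kernels (an illustrative
    choice; the kernel family is an assumption, not fixed by the abstract)."""
    # y: (n,) evaluation points; centers: (m,) kernel centers;
    # weights: (m,) nonnegative weights summing to 1; h: bandwidth.
    d = (y[:, None] - centers[None, :]) / h            # standardized distances, (n, m)
    k = np.exp(-0.5 * d**2) / (np.sqrt(2 * np.pi) * h) # Gaussian kernel values
    p = k @ weights                                    # weighted Parzen pdf estimate
    dp = (k * (-d / h)) @ weights                      # its derivative d p / d y
    return p, -dp / p                                  # pdf and score phi(y)

def natural_gradient_ica(X, steps=200, lr=0.05, h=0.5):
    """Minimal natural-gradient ICA sketch with Parzen-window scores.
    X: (n_sources, n_samples) mixed signals. Returns a demixing matrix W.
    Uniform kernel weights stand in for the fitted weighted Parzen windows."""
    n, T = X.shape
    W = np.eye(n)
    weights = np.full(T, 1.0 / T)                      # uniform weights (assumption)
    for _ in range(steps):
        Y = W @ X                                      # current output components
        # score of each estimated marginal, evaluated at the samples themselves
        Phi = np.vstack([parzen_pdf_and_score(Y[i], Y[i], weights, h)[1]
                         for i in range(n)])
        # natural gradient step on the KL contrast: W <- W + lr (I - E[phi(y) y^T]) W
        W = W + lr * (np.eye(n) - (Phi @ Y.T) / T) @ W
    return W
```

With a single unit-weight kernel at the origin and h = 1, the Parzen estimate reduces to a standard normal density, whose score is exactly phi(y) = y, which gives a quick sanity check on the gradient algebra.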