Machine Learning with Squared-Loss Mutual Information

Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Because both divergences belong to the f-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
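For reference, the two divergences contrasted in the abstract can be written out explicitly (these are the standard textbook definitions, not formulas quoted from the article itself):

```latex
% Ordinary MI: the Kullback–Leibler divergence from the joint
% distribution p(x,y) to the product of the marginals p(x)p(y)
\mathrm{MI}(X;Y) \;=\; \iint p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\,\mathrm{d}x\,\mathrm{d}y

% Squared-loss MI: the Pearson-divergence variant, i.e. the squared
% deviation of the density ratio r(x,y) = p(x,y)/(p(x)p(y)) from one,
% averaged over the product of the marginals
\mathrm{SMI}(X;Y) \;=\; \frac{1}{2}\iint p(x)\,p(y)\left(\frac{p(x,y)}{p(x)\,p(y)} - 1\right)^{2}\mathrm{d}x\,\mathrm{d}y
```

Both quantities are non-negative and vanish exactly when X and Y are independent, which is what makes them usable as independence measures.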


Bibliographic Details
Main Author: Masashi Sugiyama
Format: Article
Language: English
Published: MDPI AG, 2012-12-01
Series: Entropy
Subjects: squared-loss mutual information; Pearson divergence; density-ratio estimation; independence testing; dimensionality reduction; independent component analysis; object matching; clustering; causal inference; machine learning
Online Access: http://www.mdpi.com/1099-4300/15/1/80
id doaj-0df220e945de418d85a81f69e0dc4a51
record_format Article
spelling doaj-0df220e945de418d85a81f69e0dc4a51 2020-11-24T20:58:50Z eng MDPI AG Entropy 1099-4300 2012-12-01 15(1):80-112 10.3390/e15010080 Machine Learning with Squared-Loss Mutual Information Masashi Sugiyama http://www.mdpi.com/1099-4300/15/1/80
collection DOAJ
language English
format Article
sources DOAJ
author Masashi Sugiyama
spellingShingle Masashi Sugiyama
Machine Learning with Squared-Loss Mutual Information
Entropy
squared-loss mutual information
Pearson divergence
density-ratio estimation
independence testing
dimensionality reduction
independent component analysis
object matching
clustering
causal inference
machine learning
author_facet Masashi Sugiyama
author_sort Masashi Sugiyama
title Machine Learning with Squared-Loss Mutual Information
title_short Machine Learning with Squared-Loss Mutual Information
title_full Machine Learning with Squared-Loss Mutual Information
title_fullStr Machine Learning with Squared-Loss Mutual Information
title_full_unstemmed Machine Learning with Squared-Loss Mutual Information
title_sort machine learning with squared-loss mutual information
publisher MDPI AG
series Entropy
issn 1099-4300
publishDate 2012-12-01
description Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Because both divergences belong to the f-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
topic squared-loss mutual information
Pearson divergence
density-ratio estimation
independence testing
dimensionality reduction
independent component analysis
object matching
clustering
causal inference
machine learning
url http://www.mdpi.com/1099-4300/15/1/80
work_keys_str_mv AT masashisugiyama machinelearningwithsquaredlossmutualinformation
_version_ 1716784374391767040
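The "direct density-ratio estimation" route mentioned in the abstract can be illustrated with a minimal least-squares sketch in Python. The Gaussian kernel model, bandwidth, basis count, and regularization constant below are illustrative assumptions for one-dimensional data, not the article's exact algorithm:

```python
import numpy as np

def smi_lsmi(x, y, n_basis=50, sigma=1.0, lam=1e-3, seed=0):
    """Least-squares SMI estimation sketch for 1-D samples.

    Models the density ratio r(x, y) = p(x, y) / (p(x) p(y)) as a linear
    combination of Gaussian kernels centered at a random subset of the
    samples, fits it by regularized least squares (which has a closed-form
    solution), and plugs the fit into the Pearson-divergence definition.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    b = min(n_basis, n)
    centers = rng.choice(n, size=b, replace=False)

    # Kernel matrices: Kx[i, l] = exp(-(x_i - x_{c_l})^2 / (2 sigma^2))
    Kx = np.exp(-(x[:, None] - x[centers][None, :]) ** 2 / (2 * sigma**2))
    Ky = np.exp(-(y[:, None] - y[centers][None, :]) ** 2 / (2 * sigma**2))

    # h_l = (1/n) sum_i phi_l(x_i, y_i): expectation over the joint
    h = (Kx * Ky).mean(axis=0)
    # H_{ll'} = (1/n^2) sum_i sum_j phi_l(x_i, y_j) phi_l'(x_i, y_j):
    # expectation over the product of the empirical marginals
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2

    # Closed-form ridge solution for the ratio-model coefficients
    theta = np.linalg.solve(H + lam * np.eye(b), h)

    # Plug-in SMI estimate: (1/2) h^T theta - 1/2
    return 0.5 * h @ theta - 0.5
```

The closed-form ridge solve is the point of the exercise: it is one concrete reason SMI estimation can be computationally more efficient and numerically more stable than ordinary MI estimation, which has no comparable analytic solution. On strongly dependent samples the estimate comes out clearly positive, while on independent samples it stays near zero.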