Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X and Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, using standard bivariate Gaussian marginal distributions, this allows the MI to be decomposed into two positive terms: the Gaussian MI (I_g), depending upon the Gaussian correlation, i.e. the correlation between ‘Gaussianized variables’, and a non-Gaussian MI (I_ng), coinciding with joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where I_ng grows from zero at the ‘Gaussian manifold’, where moments are those of Gaussian distributions, towards infinity at the set’s boundary, where a deterministic relationship holds. Sources of joint non-Gaussianity are systematized by estimating I_ng between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, over a full range of signal-to-noise ratio (snr) variances. The effect of varying snr on I_g and I_ng is studied under several signal/noise scenarios.
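As a rough numerical illustration of the decomposition I = I_g + I_ng described above, the sketch below computes the Gaussian term in closed form as I_g = -0.5 ln(1 - rho_g^2), where rho_g is the correlation between the Gaussianized variables, and takes the non-Gaussian term as the remainder I_ng = I - I_g. This is a minimal sketch, not the authors' implementation: the function names, the rank-based Gaussianization, and the crude histogram plug-in MI estimator are assumptions made here purely for illustration.

```python
# Minimal sketch (not the authors' code) of the I = I_g + I_ng decomposition.
import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(x):
    """Map a sample to a standard Gaussian marginal via the empirical rank transform."""
    u = rankdata(x) / (len(x) + 1.0)   # empirical CDF values in (0, 1)
    return norm.ppf(u)                 # inverse normal CDF ('Gaussianization')

def gaussian_mi(x, y):
    """Gaussian MI: I_g = -0.5 * ln(1 - rho_g^2), with rho_g the correlation
    of the Gaussianized variables."""
    rho_g = np.corrcoef(gaussianize(x), gaussianize(y))[0, 1]
    return -0.5 * np.log(1.0 - rho_g**2)

def histogram_mi(x, y, bins=30):
    """Crude plug-in estimate of the total MI from a 2D histogram (illustrative only)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

# Example: a nonlinear synthetic channel y = x^2 + additive noise, where the
# correlation of the Gaussianized variables (and hence I_g) is nearly zero,
# so almost all of the dependence shows up in the non-Gaussian term I_ng.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = x**2 + 0.5 * rng.standard_normal(10_000)
I_total = histogram_mi(x, y)
I_g = gaussian_mi(x, y)
I_ng = max(I_total - I_g, 0.0)         # non-Gaussian MI as the remainder
print(f"I ~ {I_total:.3f}, I_g ~ {I_g:.3f}, I_ng ~ {I_ng:.3f}")
```

In this toy channel the Gaussian term is close to zero while the remainder is not, which is the kind of nonlinear-dependence signature the non-Gaussian MI is meant to capture.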


Bibliographic Details
Main Authors: Rui A. P. Perdigão, Carlos A. L. Pires
Format: Article
Language: English
Published: MDPI AG, 2012-06-01
Series: Entropy, Vol. 14, No. 6 (2012), pp. 1103-1126
ISSN: 1099-4300
DOI: 10.3390/e14061103
Subjects: mutual information; non-Gaussianity; maximum entropy distributions; non-Gaussian noise
Online Access: http://www.mdpi.com/1099-4300/14/6/1103