Maximum Entropy Estimation of Probability Distribution of Variables in Higher Dimensions from Lower Dimensional Data
A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
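As a worked illustration of the well-posed n ≤ m example mentioned in the abstract (m = 2, n = 1), the density of X = Y1 + Y2 with Y1, Y2 independent and uniform on [0, 1] follows from a direct convolution. The derivation below is a sketch added for context, not taken from the paper itself.

```latex
% Density of X = Y_1 + Y_2, with Y_1, Y_2 independent and uniform on [0, 1].
% The convolution integrand is 1 only where both 0 <= y <= 1 and 0 <= x - y <= 1.
\begin{align*}
f_X(x) &= \int_{-\infty}^{\infty} f_{Y_1}(y)\, f_{Y_2}(x - y)\, dy
        = \int_{\max(0,\, x-1)}^{\min(1,\, x)} 1 \, dy \\
       &= \begin{cases}
            x,     & 0 \le x \le 1, \\
            2 - x, & 1 < x \le 2, \\
            0,     & \text{otherwise},
          \end{cases}
\end{align*}
% i.e., the triangular distribution on [0, 2]; here P(y) determines Q(x) uniquely.
```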
| Main Authors: | Jayajit Das, Sayak Mukherjee, Susan E. Hodge |
|---|---|
| Author Affiliation: | Battelle Center for Mathematical Medicine, Research Institute at the Nationwide Children's Hospital, 700 Children's Drive, OH 43205, USA |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2015-07-01 |
| Series: | Entropy, Vol. 17, Iss. 7, pp. 4986-4999 |
| ISSN: | 1099-4300 |
| DOI: | 10.3390/e17074986 |
| Subjects: | maximum entropy; joint probability distribution; microbial ecology |
| Online Access: | http://www.mdpi.com/1099-4300/17/7/4986 |
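For the underdetermined case n > m discussed in the abstract, the MaxEnt estimate of Q(x) is built from P(y) alone. The sketch below is a minimal discrete illustration under assumed inputs: the map f(x1, x2) = x1 + x2 and the numbers in `p_y` are hypothetical, and this is not claimed to be the authors' exact construction. With constraints of the form "the total Q-probability of the preimage f⁻¹(y) equals P(y)", entropy is maximized by spreading each P(y) uniformly over its preimage, so in this toy case no Lagrange multipliers need to be computed explicitly.

```python
from collections import defaultdict
from itertools import product

import numpy as np

# Illustrative setup (hypothetical, not from the paper): the higher-dimensional
# variable is x = (x1, x2) with x1, x2 in {0, ..., 4}; the observed
# lower-dimensional variable is y = f(x) = x1 + x2 with known distribution P(y).
support_x = list(product(range(5), repeat=2))   # 25 joint states

def f(x):
    """Many-to-one map from the joint state x to the observed variable y."""
    return x[0] + x[1]

# Assumed observed distribution P(y) over y = 0, ..., 8 (sums to 1).
p_y = dict(enumerate([0.05, 0.05, 0.10, 0.10, 0.20, 0.20, 0.15, 0.10, 0.05]))

# Group the joint states by the y-value they map to (the preimages f^{-1}(y)).
fibers = defaultdict(list)
for x in support_x:
    fibers[f(x)].append(x)

# MaxEnt estimate subject to sum_{x in f^{-1}(y)} Q(x) = P(y) for every y:
# entropy is maximized by distributing each P(y) uniformly over its preimage.
q_x = {x: p_y[y] / len(xs) for y, xs in fibers.items() for x in xs}

assert np.isclose(sum(q_x.values()), 1.0)       # Q is a valid distribution
entropy_q = -sum(p * np.log(p) for p in q_x.values() if p > 0)
print(f"H(Q) = {entropy_q:.4f} nats")
```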