Non-linear Information Inequalities

We construct non-linear information inequalities from Matúš' infinite series of linear information inequalities. Each single non-linear inequality is sufficiently strong to prove that the closure of the set of all entropy functions is not polyhedral for four or more random variables, a fact that was already established using the series of linear inequalities. To the best of our knowledge, they are the first non-trivial examples of non-linear information inequalities.
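
The construction itself is not spelled out in this record, but the basic optimization step behind turning an infinite family of linear inequalities into a single non-linear one can be sketched. The functionals A(h), B(h), C(h) below are illustrative placeholders for linear combinations of joint entropies, not the notation of Chan and Grant or of Matúš, and the paper's family is indexed by positive integers rather than all positive reals, so its actual derivation is more delicate than this sketch.

% Illustrative sketch only (not the paper's derivation or notation): A(h), B(h), C(h)
% stand for generic linear functionals of an entropy vector h with A(h), C(h) >= 0.
% If the one-parameter family of linear inequalities
%     A(h) s^2 + B(h) s + C(h) >= 0
% held for every real s > 0, then substituting the minimizing value s = sqrt(C(h)/A(h))
% (when A(h) > 0; the case A(h) = 0 follows by letting s grow) collapses the whole
% family into the single non-linear inequality on the right.
\[
  A(h)\,s^{2} + B(h)\,s + C(h) \;\ge\; 0 \quad \text{for all } s > 0
  \qquad\Longrightarrow\qquad
  B(h) \;\ge\; -2\sqrt{A(h)\,C(h)}\,.
\]
% Matúš' series is stated only at integer values of its parameter, so the optimization
% in the paper is carried out over integers; this sketch shows only how an infinite
% linear family can collapse into one inequality that is non-linear in the entropies.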

Bibliographic Details
Main Authors: Terence Chan, Alex Grant
Format: Article
Language: English
Published: MDPI AG 2008-12-01
Series: Entropy
Subjects: Entropy, entropy function, nonlinear information inequality, nonshannon type information inequality
Online Access: http://www.mdpi.com/1099-4300/10/4/765/
id doaj-536c6c86801843e1a542b7ca0a307504
record_format Article
spelling doaj-536c6c86801843e1a542b7ca0a307504
2020-11-24T22:47:31Z
eng
MDPI AG
Entropy
1099-4300
2008-12-01
vol. 10, no. 4, pp. 765-775
10.3390/e10040765
Non-linear Information Inequalities
Terence Chan
Alex Grant
We construct non-linear information inequalities from Matúš' infinite series of linear information inequalities. Each single non-linear inequality is sufficiently strong to prove that the closure of the set of all entropy functions is not polyhedral for four or more random variables, a fact that was already established using the series of linear inequalities. To the best of our knowledge, they are the first non-trivial examples of non-linear information inequalities.
http://www.mdpi.com/1099-4300/10/4/765/
Entropy
entropy function
nonlinear information inequality
nonshannon type information inequality
collection DOAJ
language English
format Article
sources DOAJ
author Terence Chan
Alex Grant
spellingShingle Terence Chan
Alex Grant
Non-linear Information Inequalities
Entropy
Entropy
entropy function
nonlinear information inequality
nonshannon type information inequality
author_facet Terence Chan
Alex Grant
author_sort Terence Chan
title Non-linear Information Inequalities
title_short Non-linear Information Inequalities
title_full Non-linear Information Inequalities
title_fullStr Non-linear Information Inequalities
title_full_unstemmed Non-linear Information Inequalities
title_sort non-linear information inequalities
publisher MDPI AG
series Entropy
issn 1099-4300
publishDate 2008-12-01
description We construct non-linear information inequalities from Matúš' infinite series of linear information inequalities. Each single non-linear inequality is sufficiently strong to prove that the closure of the set of all entropy functions is not polyhedral for four or more random variables, a fact that was already established using the series of linear inequalities. To the best of our knowledge, they are the first non-trivial examples of non-linear information inequalities.
topic Entropy
entropy function
nonlinear information inequality
nonshannon type information inequality
url http://www.mdpi.com/1099-4300/10/4/765/
work_keys_str_mv AT terencechan nonlinearinformationinequalities
AT alexgrant nonlinearinformationinequalities
_version_ 1725681540475650048