A Utility-Based Approach to Some Information Measures

Bibliographic Details
Main Authors: Sven Sandow, Jinggang Huang, Craig Friedman
Format: Article
Language: English
Published: MDPI AG 2007-01-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/9/1/1/
Description
Summary: We review a decision-theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem, probability estimation, in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in a sense of robust absolute performance.
ISSN: 1099-4300
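
Note on the construction the summary refers to: in the standard decision-theoretic motivation, a log-utility (Kelly) investor bets on a discrete outcome X with odds O(x), and the gain in optimal expected utility from using the true measure p rather than a model q is exactly the Kullback-Leibler relative entropy; the generalized relative entropy replaces the logarithm with a general utility U. A minimal sketch in this spirit, assuming the standard horse-race setup (the allocation b, odds O, and the exact benchmark form below are illustrative notation, not taken verbatim from the paper):

\[
D_U(p \,\|\, q) \;=\; \mathbb{E}_p\!\left[ U\!\big(b_p^{*}(X)\,O(X)\big) \right]
\;-\; \mathbb{E}_p\!\left[ U\!\big(b_q^{*}(X)\,O(X)\big) \right],
\qquad
b_r^{*} \in \arg\max_{\,b \ge 0,\ \sum_x b(x)=1}\ \mathbb{E}_r\!\left[ U\!\big(b(X)\,O(X)\big) \right].
\]

For U(w) = \log w the optimal allocation is b_r^{*} = r regardless of the odds, so the difference collapses to the classical D(p\,\|\,q) = \sum_x p(x)\,\log\big(p(x)/q(x)\big); the utility-based quantity thus recovers relative entropy as its log-utility special case.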