Information-Distilling Quantizers

Let X and Y be dependent random variables. This paper considers the problem of designing a scalar quantizer for Y to maximize the mutual information between the quantizer's output and X, and develops fundamental properties and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low I(X;Y), where it is shown that, if X is binary, a constant fraction of the mutual information can always be preserved using O(log(1/I(X;Y))) quantization levels, and there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets 2 < |X| < ∞, it is established that an η-fraction of the mutual information can be preserved using roughly (log(|X|/I(X;Y)))^(η·(|X|−1)) quantization levels.
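The design objective described in the abstract — choosing a K-level scalar quantizer Q for Y so that I(X; Q(Y)) is as large as possible — can be illustrated with a small brute-force sketch. The joint distribution below and the six-symbol alphabet for Y are illustrative assumptions, not taken from the paper; for binary X the search is restricted to partitions that are contiguous once Y's symbols are sorted by the posterior P(X=1 | Y=y), a standard property of optimal binary-input quantizers.

```python
# Sketch: brute-force the best K-level scalar quantizer of Y that
# maximizes I(X; Q(Y)) for a small discrete joint distribution.
# The joint pmf is an illustrative assumption, not from the paper.
from itertools import combinations
from math import log2

# Joint pmf p(x, y): rows index binary X, columns index 6 Y-symbols.
P = [
    [0.10, 0.08, 0.07, 0.05, 0.12, 0.08],  # p(X=0, y)
    [0.02, 0.05, 0.08, 0.10, 0.05, 0.20],  # p(X=1, y)
]

def mutual_info(joint):
    """I(X;Y) in bits for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        p * log2(p / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, p in enumerate(row)
        if p > 0
    )

def best_quantizer(joint, K):
    """Max of I(X; Q(Y)) over K-cell quantizers that are contiguous
    in the posterior P(X=1|y); for binary X this covers an optimal
    quantizer, so the search is exhaustive over the relevant ones."""
    ny = len(joint[0])
    order = sorted(range(ny),
                   key=lambda y: joint[1][y] / (joint[0][y] + joint[1][y]))
    best = 0.0
    for cuts in combinations(range(1, ny), K - 1):
        bounds = [0, *cuts, ny]
        # Collapse the Y-symbols in each cell into one quantizer output.
        q = [
            [sum(joint[x][order[y]] for y in range(bounds[k], bounds[k + 1]))
             for k in range(K)]
            for x in range(2)
        ]
        best = max(best, mutual_info(q))
    return best

print(f"I(X;Y)       = {mutual_info(P):.4f} bits")
for K in (2, 3, 4):
    print(f"K={K}: I(X;Q(Y)) = {best_quantizer(P, K):.4f} bits")
```

By the data-processing inequality, I(X; Q(Y)) can only grow with K and never exceeds I(X;Y); the paper's results quantify how quickly this gap closes as K increases.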


Bibliographic Details
Main Authors: Bhatt, Alankrita (Author), Nazer, Bobak (Author), Ordentlich, Or (Author), Polyanskiy, Yury (Author)
Format: Article
Language:English
Published: Institute of Electrical and Electronics Engineers (IEEE), 2022-07-19T12:10:37Z.
Subjects:
Online Access: Get fulltext
LEADER 01412 am a22001933u 4500
001 143839
042 |a dc 
100 1 0 |a Bhatt, Alankrita  |e author 
700 1 0 |a Nazer, Bobak  |e author 
700 1 0 |a Ordentlich, Or  |e author 
700 1 0 |a Polyanskiy, Yury  |e author 
245 0 0 |a Information-Distilling Quantizers 
260 |b Institute of Electrical and Electronics Engineers (IEEE),   |c 2022-07-19T12:10:37Z. 
856 |z Get fulltext  |u https://hdl.handle.net/1721.1/143839 
520 |a Let X and Y be dependent random variables. This paper considers the problem of designing a scalar quantizer for Y to maximize the mutual information between the quantizer's output and X, and develops fundamental properties and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low I(X;Y), where it is shown that, if X is binary, a constant fraction of the mutual information can always be preserved using O(log(1/I(X;Y))) quantization levels, and there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets 2 < |X| < ∞, it is established that an η-fraction of the mutual information can be preserved using roughly (log(|X|/I(X;Y)))^(η·(|X|−1)) quantization levels. 
546 |a en 
655 7 |a Article 
773 |t 10.1109/TIT.2021.3059338 
773 |t IEEE Transactions on Information Theory