Context Quantization based on Minimum Description Length and Hierarchical Clustering

Bibliographic Details
Main Authors: Chen Hui, Chen Jianhua
Format: Article
Language: English
Published: EDP Sciences 2016-01-01
Series: MATEC Web of Conferences
Online Access: http://dx.doi.org/10.1051/matecconf/20165601001
Description
Summary: The code length of a source can be reduced effectively by using the conditional probability distributions of a context model. However, the larger the context model, the harder it becomes to estimate those conditional probability distributions from the counting statistics of the source symbols. To address this problem, a hierarchical clustering based context quantization algorithm is used to merge the conditional probability distributions of the context model so as to minimize the description length. Simulation results show that the method quantizes the context model effectively. Moreover, neither the initial cluster centers nor the number of classes needs to be specified in advance, which greatly simplifies quantizer design for the context quantization problem.
ISSN: 2261-236X
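
Illustrative sketch (not from the article): the summary describes an agglomerative, description-length-driven merging of conditional distributions, and the Python code below shows one way such a procedure could look. The two-part cost (empirical code length plus a (K - 1)/2 * log2(n) parameter penalty), the function names, and the greedy stopping rule are assumptions made for illustration, not the authors' exact algorithm.

    import numpy as np

    def description_length(counts, alphabet_size):
        """Assumed two-part MDL cost (in bits) of one cluster of contexts:
        the code length of its symbols under the empirical conditional
        distribution plus a (K - 1)/2 * log2(n) parameter cost."""
        n = counts.sum()
        if n == 0:
            return 0.0
        p = counts / n
        data_bits = float(-(counts[p > 0] * np.log2(p[p > 0])).sum())
        model_bits = 0.5 * (alphabet_size - 1) * np.log2(n)
        return data_bits + model_bits

    def quantize_contexts(count_table):
        """Agglomerative context quantization sketch: each row of count_table
        holds the symbol counts observed under one context. Repeatedly merge
        the pair of clusters whose union lowers the total description length
        the most, and stop when no merge helps, so the number of classes is
        determined automatically rather than fixed in advance."""
        k = count_table.shape[1]                  # alphabet size
        clusters = [row.astype(float) for row in count_table]
        while len(clusters) > 1:
            best = None
            for i in range(len(clusters)):
                for j in range(i + 1, len(clusters)):
                    delta = (description_length(clusters[i] + clusters[j], k)
                             - description_length(clusters[i], k)
                             - description_length(clusters[j], k))
                    if best is None or delta < best[0]:
                        best = (delta, i, j)
            if best[0] >= 0:                      # no merge shortens the description
                break
            _, i, j = best
            clusters[i] = clusters[i] + clusters[j]
            del clusters[j]
        return clusters                           # quantized conditional count vectors

    # Example: six contexts over a binary alphabet collapse into a few clusters
    # whose conditional statistics are similar enough to share one distribution.
    counts = np.array([[90, 10], [85, 15], [12, 88], [10, 90], [50, 50], [48, 52]])
    print([c.tolist() for c in quantize_contexts(counts)])

In this sketch, merging two clusters slightly lengthens the data part of the code but removes one set of model parameters, so merges are accepted only while the total description length keeps decreasing.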