Exact Probability Distribution versus Entropy

Bibliographic Details
Main Author: Kerstin Andersson
Format: Article
Language: English
Published: MDPI AG 2014-10-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/16/10/5198
Description
Summary: The problem addressed concerns determining the average number of successive attempts needed to guess a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, with moderate requirements on both memory and central processing unit (CPU) time. For realistic alphabet and word sizes (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average, relative to the total number of words, decreases almost exponentially with the word length. Comparisons with analytical lower bounds and entropy expressions are also provided.
ISSN: 1099-4300
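
For readers who want to experiment with the guessing strategy the summary describes, the following minimal Python sketch (not taken from the article; the alphabet, letter probabilities, and word length are illustrative assumptions) enumerates all words under a first-order (i.i.d. letter) model, sorts them in decreasing order of probability, and computes the exact average number of guesses. For comparison it also prints the word entropy and Massey's entropy-based lower bound E[G] >= 2^(H-2) + 1, which holds when H is at least 2 bits; this is one of the analytical lower bounds the abstract alludes to, named here as an assumption about which bound is meant.

import itertools
import math

# Hypothetical toy setup: a 3-letter alphabet with assumed occurrence
# probabilities (first-order approximation: letters drawn i.i.d.).
letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}
word_length = 4

# Enumerate all words of the given length; each word's probability is
# the product of its letter probabilities under the first-order model.
words = itertools.product(letter_probs, repeat=word_length)
probs = [math.prod(letter_probs[ch] for ch in w) for w in words]

# Guessing strategy from the abstract: try words in decreasing order of
# probability. The k-th guess succeeds with probability p_(k), so the
# average number of guesses is E[G] = sum over k of k * p_(k).
probs.sort(reverse=True)
avg_guesses = sum(k * p for k, p in enumerate(probs, start=1))

# Entropy of the word distribution, in bits, for comparison.
entropy = -sum(p * math.log2(p) for p in probs)

# Massey's lower bound on E[G] (valid when entropy >= 2 bits).
massey_bound = 2 ** (entropy - 2) + 1

print(f"number of words: {len(probs)}")
print(f"average number of guesses E[G]: {avg_guesses:.3f}")
print(f"word entropy H: {entropy:.3f} bits")
print(f"Massey lower bound on E[G]: {massey_bound:.3f}")

Exhaustive enumeration like this is only feasible for small alphabets and short words; for the realistic sizes discussed in the article, the approximations it develops (e.g., the normal approximation to the density of log probability products) replace the explicit sort over all words.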