Exact Probability Distribution versus Entropy
Main Author: | Kerstin Andersson (Department of Mathematics and Computer Science, Karlstad University, SE-651 88 Karlstad, Sweden) |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2014-10-01 |
Series: | Entropy, Vol. 16, No. 10 (2014), pp. 5198-5210 |
ISSN: | 1099-4300 |
DOI: | 10.3390/e16105198 |
Subjects: | information entropy; security; guessing |
Online Access: | http://www.mdpi.com/1099-4300/16/10/5198 |
Abstract

The problem addressed is determining the average number of successive attempts needed to guess a word of a given length composed of letters with known probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy is to guess words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, all with moderate memory and central processing unit (CPU) time requirements. For realistic alphabet and word sizes (around 100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution; in those cases, an analytical expression for the average number of guesses can be derived. The proportion of guesses needed on average, relative to the total number of words, decreases almost exponentially with the word length, and the leading term of an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
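For reference, the quantity the abstract describes has a standard closed form once the guessing order is fixed. If the M possible words are indexed in decreasing order of probability, the average number of guesses under the optimal strategy is the expected rank of the true word; this is the standard "guesswork" formulation, written here in my own notation rather than quoted from the paper:

```latex
% Average number of guesses under the optimal strategy,
% with words indexed in decreasing order of probability.
\mathbb{E}[G] = \sum_{i=1}^{M} i \, p_i ,
\qquad p_1 \ge p_2 \ge \cdots \ge p_M .
```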
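The abstract's remark about probability products can be made concrete. In a first-order model, a word's probability is a product of independent letter probabilities, so its logarithm is a sum of i.i.d. terms and, by the central limit theorem, approximately normally distributed. A sketch under that assumption (notation mine, not taken from the paper), for letter probabilities q_a and word length n:

```latex
\log p(W) = \sum_{k=1}^{n} \log q_{a_k}
\;\approx\; \mathcal{N}\!\left(n\mu,\; n\sigma^{2}\right),
\qquad
\mu = \sum_{a} q_a \log q_a = -H(Q),
\qquad
\sigma^{2} = \sum_{a} q_a \left(\log q_a\right)^{2} - \mu^{2} .
```

Here H(Q) is the entropy of the letter distribution, which is what makes entropy expressions natural points of comparison for the exact computation.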
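To make the brute-force baseline concrete, the sketch below enumerates all words over a small alphabet, sorts them by decreasing probability, and computes the exact expected number of guesses. It is a minimal illustration of the strategy described in the abstract, not the author's code; the alphabet and probabilities are hypothetical, and the enumeration scales as |alphabet|^n, which is exactly why the paper develops cheaper approximations.

```python
import itertools
import math

def expected_guesses(letter_probs, word_len):
    """Exact average number of guesses for an i.i.d. (first-order)
    letter model, guessing words in decreasing order of probability.
    Brute force: enumerates all len(letter_probs)**word_len words."""
    # A word's probability is the product of its letter probabilities,
    # so it suffices to enumerate tuples of probabilities directly.
    words = itertools.product(letter_probs, repeat=word_len)
    probs = sorted((math.prod(w) for w in words), reverse=True)
    # Expected rank: guess index times word probability, summed.
    return sum(i * p for i, p in enumerate(probs, start=1))

if __name__ == "__main__":
    q = [0.6, 0.3, 0.1]  # hypothetical skewed 3-letter alphabet
    for n in (1, 2, 4, 8):
        g = expected_guesses(q, n)
        total = len(q) ** n
        print(f"n={n:2d}: E[G] = {g:10.3f} of {total:5d} words "
              f"({g / total:.2%} of the total)")
```

With a skewed distribution like this one, the ratio E[G] / |alphabet|^n shrinks rapidly as n grows, consistent with the abstract's observation that the average proportion of guesses decays almost exponentially in word length.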