Entropy Rate Estimates for Natural Language—A New Extrapolation of Compressed Large-Scale Corpora
One of the fundamental questions about human language is whether its entropy rate is positive. The entropy rate measures the average amount of information communicated per unit time. The question about the entropy of language dates back to experiments by Shannon in 1951, but in 1990 Hilberg raised d...
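The abstract's premise, that a finite sample only yields an upper bound on the entropy rate and the limit must be extrapolated from growing corpus sizes, can be illustrated with a small sketch. This is not the paper's own procedure; it assumes a generic bz2 compressor and a hypothetical corpus.txt file, and simply reports compressed bits per character on growing prefixes.

```python
# Illustrative sketch only (not the authors' method): upper-bound the entropy
# rate in bits per character by compressing increasingly long prefixes of a
# corpus with a general-purpose compressor. "corpus.txt" is a hypothetical
# placeholder for any large plain-text file.
import bz2

def compressed_bits_per_char(text: str) -> float:
    """Compressed size in bits divided by text length in characters."""
    data = text.encode("utf-8")
    compressed = bz2.compress(data, compresslevel=9)
    return 8 * len(compressed) / len(text)

def entropy_rate_estimates(corpus: str, num_points: int = 8):
    """Bits-per-character estimates on geometrically growing prefixes.

    The entropy rate is a limit as the text length goes to infinity;
    finite prefixes only give upper bounds, which is why extrapolation
    over corpus size matters.
    """
    n = len(corpus)
    sizes = sorted({max(1, n // (2 ** k)) for k in range(num_points)})
    return [(m, compressed_bits_per_char(corpus[:m])) for m in sizes]

if __name__ == "__main__":
    with open("corpus.txt", encoding="utf-8") as f:
        corpus = f.read()
    for m, h in entropy_rate_estimates(corpus):
        print(f"prefix length {m:>10d} chars: {h:.3f} bits/char")
```

On natural-language text the printed estimates typically decrease as the prefix grows, which is the behavior one would extrapolate to estimate the limiting rate.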
Main Authors: Ryosuke Takahira, Kumiko Tanaka-Ishii, Łukasz Dębowski
Format: Article
Language: English
Published: MDPI AG (2016-10-01)
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/18/10/364
Similar Items
- Study of Entropy Generation with Multi-Slip Effects in MHD Unsteady Flow of Viscous Fluid Past an Exponentially Stretching Surface
  by: Sajjad Haider, et al.
  Published: (2020-03-01)
- Entropy Analysis of 3D Non-Newtonian MHD Nanofluid Flow with Nonlinear Thermal Radiation Past over Exponential Stretched Surface
  by: Muhammad Suleman, et al.
  Published: (2018-12-01)
- A Fast Fractal Image Compression Method Based Entropy
  by: M. Hassaballah, et al.
  Published: (2005-08-01)
- Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate
  by: Shuntaro Takahashi, et al.
  Published: (2018-11-01)
- Entropy Production in the Expanding Universe
  by: Mehrnoosh Farahmand, et al.
  Published: (2017-11-01)