Exact Test of Independence Using Mutual Information
Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). …
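The abstract describes an exact significance test built on surrogate sequences that preserve transition counts. As a rough, simplified illustration of the same idea, the sketch below uses a plain shuffle-based (permutation) null for i.i.d. discrete sequences rather than the authors' transition-count-preserving surrogates; the function names `mutual_information` and `mi_permutation_test` and their parameters are hypothetical, not from the paper.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of mutual information (in bits) between two discrete sequences."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    joint = {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
    px, py = {}, {}
    for (xi, yi), c in joint.items():
        px[xi] = px.get(xi, 0) + c
        py[yi] = py.get(yi, 0) + c
    mi = 0.0
    for (xi, yi), c in joint.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), written with counts to avoid extra divisions
        mi += (c / n) * np.log2(c * n / (px[xi] * py[yi]))
    return mi

def mi_permutation_test(x, y, num_surrogates=10000, rng=None):
    """Monte Carlo p-value for H0: I(X;Y) = 0, obtained by shuffling y to break any dependence."""
    rng = np.random.default_rng() if rng is None else rng
    observed = mutual_information(x, y)
    y = np.asarray(y)
    count = sum(
        mutual_information(x, rng.permutation(y)) >= observed
        for _ in range(num_surrogates)
    )
    # Add-one correction keeps the estimated p-value strictly positive.
    return observed, (count + 1) / (num_surrogates + 1)

# Example: two independent binary sequences should typically give a large p-value.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=200)
y = rng.integers(0, 2, size=200)
mi_hat, p = mi_permutation_test(x, y, num_surrogates=2000, rng=rng)
print(f"estimated MI = {mi_hat:.4f} bits, p-value = {p:.3f}")
```

Note that a simple shuffle null is only appropriate when the samples are independent draws; the paper's surrogate construction addresses the harder case of sequences with serial (Markov) structure.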
| Main Authors: | Shawn D. Pethel, Daniel W. Hahs |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2014-05-01 |
| Series: | Entropy |
| Online Access: | http://www.mdpi.com/1099-4300/16/5/2839 |
Similar Items

- Measuring Independence between Statistical Randomness Tests by Mutual Information, by: Jorge Augusto Karell-Albo, et al. Published: (2020-07-01)
- An Estimator of Mutual Information and its Application to Independence Testing, by: Joe Suzuki. Published: (2016-03-01)
- Assessing Conceptual Complexity and Compressibility Using Information Gain and Mutual Information, by: Fabien Mathy. Published: (2010-03-01)
- Generalized Mutual Information, by: Zhiyi Zhang. Published: (2020-06-01)
- Error Exponents and α-Mutual Information, by: Sergio Verdú. Published: (2021-02-01)