A Recipe for the Estimation of Information Flow in a Dynamical System

Information-theoretic quantities, such as entropy and mutual information (MI), can be used to quantify the amount of information needed to describe a dataset or the information shared between two datasets. In the case of a dynamical system, the behavior of the relevant variables can be tightly coupled, such that information about one variable at a given instant in time may provide information about other variables at later instants. This is often viewed as a flow of information, and tracking such a flow can reveal relationships among the system variables. Since MI is a symmetric quantity, an asymmetric quantity, called Transfer Entropy (TE), has been proposed to estimate the directionality of the coupling. However, accurate estimation of entropy-based measures is notoriously difficult: every method has its own free tuning parameter(s), and there is no consensus on an optimal way of estimating the TE from a dataset. We propose a new methodology to estimate TE and apply a set of methods together as an accuracy cross-check, providing a reliable mathematical tool for any given dataset. We demonstrate both the variability in TE estimation across techniques and the benefits of the proposed methodology for reliably estimating the directionality of coupling among variables.
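The abstract does not reproduce the definition it refers to; for reference, the standard first-order form of TE (due to Schreiber) for two time series X and Y is, in LaTeX notation,

T_{Y \to X} \;=\; \sum_{x_{t+1},\, x_t,\, y_t} p(x_{t+1}, x_t, y_t)\, \log \frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)},

where x_t and y_t are the values of the two series at time t. Because y_t enters only the numerator's conditioning set, T_{Y \to X} and T_{X \to Y} generally differ; this asymmetry is what lets TE assign a direction to the coupling that the symmetric MI cannot.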

Bibliographic Details
Main Authors: Deniz Gencaga (NOAA-CREST, The City College of New York, New York, NY 10031, USA), Kevin H. Knuth (Depts. of Physics and Informatics, University at Albany (SUNY), Albany, NY 12222, USA), William B. Rossow (NOAA-CREST, The City College of New York, New York, NY 10031, USA)
Format: Article
Language: English
Published: MDPI AG, 2015-01-01
Series: Entropy, Vol. 17, No. 1, pp. 438-470
ISSN: 1099-4300
DOI: 10.3390/e17010438
Subjects: transfer entropy; information flow; statistical dependency; mutual information; Shannon entropy; information-theoretical quantities; Lorenz equations
Online Access: http://www.mdpi.com/1099-4300/17/1/438
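To make the abstract's point about free tuning parameters concrete, below is a minimal sketch of the simplest estimation approach: a fixed, equal-width histogram (binning) estimator of TE(Y -> X) in Python. It illustrates the general technique only, not the authors' recommended recipe; the function name, the bin count n_bins, and the toy data are choices made here for the example.

import numpy as np

def transfer_entropy(x, y, n_bins=8):
    """Histogram estimate of TE(Y -> X), in bits, from two 1-D time series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Lagged triplet (x_{t+1}, x_t, y_t) from the definition above.
    x_next, x_now, y_now = x[1:], x[:-1], y[:-1]

    def digitize(s):
        # Equal-width bins; n_bins is the kind of free tuning parameter
        # the abstract warns about.
        edges = np.linspace(s.min(), s.max(), n_bins + 1)
        return np.clip(np.digitize(s, edges[1:-1]), 0, n_bins - 1)

    a, b, c = digitize(x_next), digitize(x_now), digitize(y_now)
    # Joint histogram over (x_{t+1}, x_t, y_t), normalized to probabilities.
    joint = np.zeros((n_bins, n_bins, n_bins))
    np.add.at(joint, (a, b, c), 1.0)
    p_abc = joint / joint.sum()
    p_bc = p_abc.sum(axis=0)       # p(x_t, y_t)
    p_ab = p_abc.sum(axis=2)       # p(x_{t+1}, x_t)
    p_b = p_abc.sum(axis=(0, 2))   # p(x_t)

    # Plug-in version of the formula given earlier:
    # TE = sum p(a,b,c) * log2[ p(a,b,c) p(b) / (p(b,c) p(a,b)) ].
    te = 0.0
    for i, j, k in zip(*np.nonzero(p_abc)):
        te += p_abc[i, j, k] * np.log2(
            p_abc[i, j, k] * p_b[j] / (p_bc[j, k] * p_ab[i, j]))
    return te

# Toy check: y drives x with a one-step lag, so TE(Y -> X) should clearly
# exceed TE(X -> Y).
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
x = np.roll(y, 1) + 0.5 * rng.normal(size=10_000)
print(transfer_entropy(x, y), transfer_entropy(y, x))

Other estimator families (e.g., adaptive binning or kernel density estimators) differ only in how the probabilities are obtained; comparing results across several such methods is the kind of accuracy cross-check the abstract describes.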