On Geometry of Information Flow for Causal Inference
Main Authors: | Sudam Surasinghe, Erik M. Bollt |
---|---|
Affiliations: | Sudam Surasinghe: Department of Mathematics, Clarkson University, Potsdam, NY 13699, USA. Erik M. Bollt: Department of Electrical and Computer Engineering, Clarkson Center for Complex Systems Science (C3S2), Clarkson University, Potsdam, NY 13699, USA |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-03-01 |
Series: | Entropy |
ISSN: | 1099-4300 |
DOI: | 10.3390/e22040396 |
Subjects: | causal inference; transfer entropy; differential entropy; correlation dimension; Pinsker’s inequality; Frobenius–Perron operator |
Online Access: | https://www.mdpi.com/1099-4300/22/4/396 |
Abstract:
Causal inference is perhaps one of the most fundamental concepts in science. It originates in the work of ancient philosophers, runs through to today, and is woven strongly into current work by statisticians, machine learning experts, and scientists from many other fields. This paper takes the perspective of information flow, which includes the Nobel-prize-winning work on Granger causality and the recently popular transfer entropy, both probabilistic in nature. Our main contribution is to develop analysis tools that allow a geometric interpretation of information flow as a causal inference indicated by positive transfer entropy. We describe the effective dimensionality of an underlying manifold as projected into the outcome space that summarizes information flow. Contrasting the probabilistic and geometric perspectives, we introduce a new measure of causal inference based on the fractal correlation dimension conditionally applied to competing explanations of future forecasts, which we write $\mathrm{GeoC}_{y \to x}$. This avoids some of the boundedness issues that we show exist for the transfer entropy, $T_{y \to x}$. We highlight our discussion with data from synthetic models of successively more complex nature, including the Hénon map, and finally a real physiological example relating breathing and heart rate function.
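The abstract uses $T_{y \to x}$ without restating its definition. As background only (not quoted from the paper), transfer entropy is conventionally defined following Schreiber's formulation, shown here with history length 1 for brevity; the paper may use longer histories:

```latex
% Transfer entropy from y to x (Schreiber's standard form, history length 1).
% It measures how much knowing y_n reduces uncertainty about x_{n+1}
% beyond what x_n alone already explains; it is zero when y adds nothing.
T_{y \to x} \;=\; \sum_{x_{n+1},\, x_n,\, y_n}
    p\!\left(x_{n+1}, x_n, y_n\right)\,
    \log \frac{p\!\left(x_{n+1} \mid x_n, y_n\right)}{p\!\left(x_{n+1} \mid x_n\right)}
```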
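The geometric measure $\mathrm{GeoC}_{y \to x}$ builds on the fractal correlation dimension, which this record does not define. The sketch below is a minimal, illustrative Grassberger–Procaccia correlation-dimension estimator; the function name, scale choices, and sanity check are our assumptions, not the authors' code or the paper's definition of $\mathrm{GeoC}_{y \to x}$.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(points, radii):
    """Grassberger-Procaccia estimate of the correlation dimension.

    points: (N, d) array of state-space samples.
    radii:  increasing scales r at which the correlation sum C(r) is
            evaluated; the dimension estimate is the slope of
            log C(r) versus log r over these scales.
    """
    dists = pdist(points)  # all N*(N-1)/2 pairwise distances
    # Correlation sum C(r): fraction of point pairs closer than r.
    C = np.array([np.mean(dists < r) for r in radii])
    keep = C > 0  # drop radii too small to capture any pair
    slope, _intercept = np.polyfit(np.log(radii[keep]), np.log(C[keep]), 1)
    return slope

# Sanity check: samples on a circle (a 1-D manifold in the plane)
# should yield a dimension estimate close to 1.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=2000)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
print(correlation_dimension(circle, np.logspace(-2, -0.5, num=20)))
```

Per the abstract, the paper applies such dimension estimates conditionally to competing explanations of future forecasts of $x$ (with and without knowledge of $y$); the precise construction of $\mathrm{GeoC}_{y \to x}$ is in the paper at the Online Access link above.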