Evaluating Approximations and Heuristic Measures of Integrated Information

Integrated information theory (IIT) proposes a measure of integrated information, termed Phi (Φ), to capture the level of consciousness of a physical system in a given state. Unfortunately, calculating Φ itself is currently possible only for very small model systems and far from computable for the kinds of system typically associated with consciousness (brains). Here, we considered several proposed heuristic measures and computational approximations, some of which can be applied to larger systems, and tested if they correlate well with Φ. While these measures and approximations capture intuitions underlying IIT and some have had success in practical applications, it has not been shown that they actually quantify the type of integrated information specified by the latest version of IIT and, thus, whether they can be used to test the theory. In this study, we evaluated these approximations and heuristic measures considering how well they estimated the Φ values of model systems and not on the basis of practical or clinical considerations. To do this, we simulated networks consisting of 3–6 binary linear threshold nodes randomly connected with excitatory and inhibitory connections. For each system, we then constructed the system's state transition probability matrix (TPM) and generated observed data over time from all possible initial conditions. We then calculated Φ, approximations to Φ, and measures based on state differentiation, coalition entropy, state uniqueness, and integrated information. Our findings suggest that Φ can be approximated closely in small binary systems by using one or more of the readily available approximations (r > 0.95) but without major reductions in computational demands. Furthermore, the maximum value of Φ across states (a state-independent quantity) correlated strongly with measures of signal complexity (LZ, r_s = 0.722), decoder-based integrated information (Φ*, r_s = 0.816), and state differentiation (D1, r_s = 0.827). These measures could allow for the efficient estimation of a system's capacity for high Φ or function as accurate predictors of low- (but not high-) Φ systems. While it is uncertain whether the results extend to larger systems or systems with other dynamics, we stress the importance that measures aimed at being practical alternatives to Φ be, at a minimum, rigorously tested in an environment where the ground truth can be established.
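The abstract describes simulating networks of 3–6 binary linear threshold nodes and constructing each system's state transition probability matrix (TPM). A minimal sketch of that construction, assuming a simple deterministic update rule (a node turns on iff its weighted input exceeds a threshold) and a hypothetical `threshold_tpm` helper; the paper's exact update convention and state ordering may differ:

```python
import itertools
import numpy as np

def threshold_tpm(weights, threshold=0.0):
    """State-by-state TPM for a binary linear threshold network.

    weights[i, j] is the connection weight from node j to node i;
    node i is on at the next step iff its weighted input exceeds
    `threshold`. Deterministic dynamics, so each row is one-hot.
    """
    n = weights.shape[0]
    states = list(itertools.product([0, 1], repeat=n))  # all 2^n states
    tpm = np.zeros((2 ** n, 2 ** n))
    for idx, s in enumerate(states):
        nxt = tuple(int(weights[i] @ np.array(s) > threshold)
                    for i in range(n))
        tpm[idx, states.index(nxt)] = 1.0  # probability 1 transition
    return tpm

# 3-node example with mixed excitatory (+) and inhibitory (-) weights
W = np.array([[ 0.0, 1.0, -1.0],
              [ 1.0, 0.0,  1.0],
              [-1.0, 1.0,  0.0]])
tpm = threshold_tpm(W)
print(tpm.shape)  # (8, 8)
```

With the TPM in hand, "observed data over time from all possible initial conditions" amounts to iterating each row's one-hot successor from every starting state.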

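Among the heuristic measures the abstract compares against Φ is Lempel–Ziv signal complexity (LZ). A rough sketch of the classic LZ76 phrase-counting scheme, as one plausible reading of that measure; the paper's binarization and normalization choices are not reproduced here:

```python
def lz_complexity(s):
    """LZ76 complexity: count of distinct phrases parsed left to right.

    Each new phrase is extended while it still occurs as a substring of
    everything seen before it; a higher count means a less compressible,
    more "differentiated" binary signal.
    """
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while s[i:i+l] already occurs earlier
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

print(lz_complexity("0001101001000101"))  # → 6 (phrases 0|001|10|100|1000|101)
```

In practice the raw count is usually normalized (e.g. by n / log2(n)) before correlating with other measures, as a fully random string of length n has roughly that many phrases.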

Bibliographic Details
Main Authors: André Sevenius Nilsen, Bjørn Erik Juel, William Marshall
Format: Article
Language: English
Published: MDPI AG, 2019-05-01
Series: Entropy
Subjects: integrated information theory; differentiation; integration; complexity; consciousness; computational; IIT; Phi
Online Access: https://www.mdpi.com/1099-4300/21/5/525
Author Affiliations:
André Sevenius Nilsen, Bjørn Erik Juel: Brain Signalling Group, Department of Physiology, Institute of Basic Medicine, University of Oslo, Sognsvannsveien 9, 0315 Oslo, Norway
William Marshall: Department of Psychiatry, University of Wisconsin, Madison, WI 53719, USA
DOI: 10.3390/e21050525
ISSN: 1099-4300