Large-scale directed network inference with multivariate transfer entropy and hierarchical statistical testing

Network inference algorithms are valuable tools for the study of large-scale neuroimaging datasets. Multivariate transfer entropy is well suited for this task, being a model-free measure that captures nonlinear and lagged dependencies between time series to infer a minimal directed network model. Greedy algorithms have been proposed to efficiently deal with high-dimensional datasets while avoiding redundant inferences and capturing synergistic effects. However, multiple statistical comparisons may inflate the false positive rate and are computationally demanding, which limited the size of previous validation studies. The algorithm we present, as implemented in the IDTxl open-source software, addresses these challenges by employing hierarchical statistical tests to control the family-wise error rate and to allow for efficient parallelization. The method was validated on synthetic datasets involving random networks of increasing size (up to 100 nodes), for both linear and nonlinear dynamics. The performance increased with the length of the time series, reaching consistently high precision, recall, and specificity (>98% on average) for 10,000 time samples. Varying the statistical significance threshold showed a more favorable precision-recall trade-off for longer time series. Both the network size and the sample size are one order of magnitude larger than previously demonstrated, showing feasibility for typical EEG and magnetoencephalography experiments.
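
In this context, multivariate transfer entropy from a source process X to a target Y, conditioned on a set Z of other relevant sources, is commonly written as a conditional mutual information. The expression below is a standard formulation rather than a quotation from the article, and the embedding notation is illustrative:

% Multivariate (conditional) transfer entropy from source X to target Y,
% conditioned on the set Z of other sources already selected into the model.
% X_t^-, Y_t^-, Z_t^- denote the (embedded) past states of each process.
TE_{X \to Y \mid Z} = I\left( X_t^{-} ; Y_t \mid Y_t^{-}, Z_t^{-} \right)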

Bibliographic Details
Main Authors: Leonardo Novelli, Patricia Wollstadt, Pedro Mediano, Michael Wibral, Joseph T. Lizier
Format: Article
Language: English
Published: The MIT Press, 2019-07-01
Series: Network Neuroscience
ISSN: 2472-1751
Subjects: Neuroimaging; Directed connectivity; Effective network; Multivariate transfer entropy; Information theory; Nonlinear dynamics; Statistical inference; Nonparametric tests
Online Access: https://www.mitpressjournals.org/doi/pdf/10.1162/netn_a_00092
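
A minimal usage sketch of the IDTxl open-source software mentioned in the abstract is shown below. It follows the pattern of IDTxl's public examples (the MultivariateTE and Data classes and the analyse_network call); the estimator choice, lag settings, and the built-in MUTE toy dataset are illustrative assumptions rather than settings taken from the article.

# Minimal sketch: multivariate TE network inference with IDTxl (Python).
# Assumes IDTxl is installed (pip install idtxl) and, for the JIDT-based
# estimator used here, a working Java runtime. All settings are illustrative.
from idtxl.multivariate_te import MultivariateTE
from idtxl.data import Data

# Toy dataset shipped with IDTxl (MUTE example): five coupled processes.
data = Data()
data.generate_mute_data(n_samples=1000, n_replications=5)

settings = {
    'cmi_estimator': 'JidtGaussianCMI',  # linear-Gaussian estimator (assumption)
    'max_lag_sources': 5,                # candidate source lags to test
    'min_lag_sources': 1,
}

# Greedy, per-target candidate selection with hierarchical permutation tests.
network_analysis = MultivariateTE()
results = network_analysis.analyse_network(settings=settings, data=data)

# Binary adjacency matrix of the inferred directed network.
adjacency = results.get_adjacency_matrix(weights='binary', fdr=False)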