Range Entropy: A Bridge between Signal Complexity and Self-Similarity

Approximate entropy (ApEn) and sample entropy (SampEn) are widely used for temporal complexity analysis of real-world phenomena. However, their relationship with the Hurst exponent as a measure of self-similarity is not widely studied. Additionally, ApEn and SampEn are susceptible to signal amplitude changes, which is commonly addressed by correcting the input signal amplitude by its standard deviation. In this study, we first show, using simulations, that ApEn and SampEn are related to the Hurst exponent through their tolerance r and embedding dimension m parameters. We then propose a modification of ApEn and SampEn called range entropy, or RangeEn. We show that RangeEn is more robust to nonstationary signal changes and has a more linear relationship with the Hurst exponent than ApEn and SampEn. RangeEn is bounded in the tolerance r-plane between 0 (maximum entropy) and 1 (minimum entropy) and requires no signal amplitude correction. Finally, we demonstrate the clinical usefulness of signal entropy measures for characterisation of epileptic EEG data as a real-world example.
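
Below is a minimal Python sketch, written from the description above rather than taken from the authors' reference implementation, of how SampEn and a RangeEn-style variant could be computed. It assumes that RangeEn replaces SampEn's Chebyshev distance between embedded templates with a range-normalised distance, (max - min)/(max + min) of the element-wise absolute differences, which is what bounds the tolerance r between 0 and 1 and removes the need to rescale the signal by its standard deviation; the function names and parameter defaults are illustrative only.

import numpy as np


def _embed(x, m):
    """Return all length-m templates of signal x as rows of a matrix."""
    n = len(x) - m + 1
    return np.array([x[i:i + m] for i in range(n)])


def _match_rate(templates, r, dist):
    """Fraction of ordered template pairs (i != j) whose distance is below r."""
    n = len(templates)
    count = 0
    for i in range(n):
        d = dist(np.abs(templates - templates[i]))
        count += np.sum(d < r) - 1          # subtract the self-match
    return count / (n * (n - 1))


def chebyshev(diff):
    """Classic ApEn/SampEn distance: maximum element-wise difference."""
    return diff.max(axis=1)


def range_dist(diff):
    """Range-normalised distance, bounded in [0, 1): (max - min) / (max + min)."""
    mx, mn = diff.max(axis=1), diff.min(axis=1)
    return (mx - mn) / (mx + mn + 1e-12)    # epsilon avoids 0/0 for identical templates


def sample_entropy(x, m=2, r=0.2, dist=chebyshev):
    """SampEn = -log(A/B); pass dist=range_dist for the RangeEn-style variant."""
    x = np.asarray(x, dtype=float)
    b = _match_rate(_embed(x, m), r, dist)
    a = _match_rate(_embed(x, m + 1), r, dist)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(500)
    # Classic SampEn: r is usually scaled by the signal's standard deviation.
    print("SampEn :", sample_entropy(x, m=2, r=0.2 * x.std()))
    # RangeEn-style variant: r lives directly in [0, 1], no amplitude correction.
    print("RangeEn:", sample_entropy(x, m=2, r=0.2, dist=range_dist))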


Bibliographic Details
Main Authors: Amir Omidvarnia, Mostefa Mesbah, Mangor Pedersen, Graeme Jackson
Format: Article
Language: English
Published: MDPI AG, 2018-12-01
Series: Entropy
ISSN: 1099-4300
Subjects: approximate entropy; sample entropy; range entropy; complexity; self-similarity; Hurst exponent
Online Access: https://www.mdpi.com/1099-4300/20/12/962