Statistical tools for the analysis of event-related potentials in electroencephalograms


Bibliographic Details
Main Author: Bugli, Céline
Format: Others
Language: en
Published: Universite catholique de Louvain 2006
Subjects:
EEG
ICA
Online Access:http://edoc.bib.ucl.ac.be:81/ETD-db/collection/available/BelnUcetd-06182006-153528/
Description
Summary: Since its first use in humans in 1929, the electroencephalogram (EEG) has become one of the most important diagnostic tools in clinical neurophysiology. However, its use in clinical studies is limited because the huge quantity of collected information is difficult to process: it is hard to obtain an overall picture of this multivariate problem. In addition to the sheer quantity of data, an intrinsic problem with electroencephalograms is that the signals are "contaminated" by body signals not directly related to cerebral activity. These signals are of no direct interest when evaluating a treatment effect on the brain, yet removing this "parasitic noise" from electroencephalograms is a difficult task. We use clinical data kindly made available by the pharmaceutical company Eli Lilly (Lilly Clinical Operations S.A., Louvain-la-Neuve, Belgium). Particular types of analyses had already been carried out on these data, most based on frequency bands. They mainly confirmed the enormous potential of EEG in clinical studies without providing much insight into the treatment effect on the brain. The aim of this thesis is to propose and evaluate a panel of statistical techniques to clean and analyze electroencephalograms. The first tool presented makes it possible to align curves, such as selected parts of EEGs, before any further statistical treatment. Indeed, when monitoring some continuous process on similar units (such as patients in a clinical study), one often notices a typical pattern common to all curves, but with variation in both amplitude and dynamics across curves. In particular, typical peaks may be shifted from unit to unit. This complicates the statistical analysis of samples of curves: for example, the cross-sectional average usually does not reflect the typical curve pattern, because the shifts smear the signal structure or may even make it disappear.
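As a toy illustration of why alignment matters (not the thesis's actual registration method), the following Python sketch simulates curves whose common peak is shifted across subjects, then compares the cross-sectional average before and after a simple landmark shift; the simulated Gaussian peaks and shift range are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)

# Simulate 20 curves sharing one peak, but shifted in time across "subjects".
shifts = rng.uniform(-0.15, 0.15, size=20)
curves = np.array(
    [np.exp(-((t - 0.5 - s) ** 2) / (2 * 0.03 ** 2)) for s in shifts]
)

# Naive cross-sectional average: the shifted peaks smear each other out,
# so the mean curve's peak is much flatter than any individual curve's.
naive_mean = curves.mean(axis=0)

# Landmark alignment: shift each curve so its peak sits at the grid centre
# (np.roll wraps around the edges, which is harmless in this toy example).
centre = len(t) // 2
aligned = np.array([np.roll(c, centre - np.argmax(c)) for c in curves])
aligned_mean = aligned.mean(axis=0)

print(aligned_mean.max() > naive_mean.max())  # alignment restores the peak
```

After alignment the averaged peak is close to the individual peak height, whereas the naive average understates it, which is exactly the smearing effect described above.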
Another of the presented tools is based on a preliminary linear decomposition of EEGs into statistically independent signals. This decomposition provides, on the one hand, an effective cleaning method and, on the other, a considerable reduction in the quantity of data to be analyzed. Decomposing signals into statistically independent components is a well-known technique in physics, primarily used to unmix sound signals; it is named Independent Component Analysis, or ICA. The last tool studied is functional ANOVA. The analysis of longitudinal curve data is a methodological and computational challenge for statisticians. Such data are often generated in biomedical studies, where the statistical analysis usually focuses on simple summary measures, thereby discarding potentially important information. We propose to model these curves using nonparametric regression techniques based on splines.