Pilot Study on Gait Classification Using fNIRS Signals


Bibliographic Details
Main Authors: Hedian Jin, Chunguang Li, Jiacheng Xu
Format: Article
Language: English
Published: Hindawi Limited, 2018-01-01
Series: Computational Intelligence and Neuroscience
Online Access: http://dx.doi.org/10.1155/2018/7403471
Description
Summary: Rehabilitation training is essential for patients with motor dysfunction, and training driven by the patient's own motion intention is more conducive to rehabilitation than passive training. This study proposes a method to identify the motion intention associated with different walking states in a normal environment, using functional near-infrared spectroscopy (fNIRS). Twenty-two healthy subjects were recruited to walk with three different gaits (small step at low speed, small step at medium speed, and medium step at low speed). Wavelet packet decomposition was used to identify the main characteristic channels in the different motion states, and channels related in frequency and space were combined to form feature vectors. A library for support vector machines (LIBSVM) was then applied to different permutations and combinations of the feature vectors to find the best recognition model. The final recognition accuracy over the three walking states was 78.79%. This study classified the motion intention of different walking states using fNIRS. It lays a foundation for applying the classified motion intention in real time, helping patients with severe motor dysfunction control a walking-assistive device for rehabilitation training, so that they can regain independent walking ability and the economic burden on society is reduced.
ISSN: 1687-5265, 1687-5273
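
The abstract outlines a two-stage pipeline: wavelet packet decomposition of channel signals into feature vectors, followed by SVM classification of the three gaits. Below is a minimal sketch of that kind of pipeline, assuming synthetic data, illustrative channel counts and decomposition settings, and scikit-learn's SVC (which wraps LIBSVM) in place of the authors' exact setup; none of these parameters come from the paper.

```python
# Hypothetical sketch of the pipeline the abstract describes:
# wavelet packet decomposition (WPD) per fNIRS channel -> energy
# feature vectors -> SVM (scikit-learn's SVC, built on LIBSVM) for
# 3-class gait classification. Dimensions and data are illustrative.
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

N_TRIALS, N_CHANNELS, N_SAMPLES = 90, 8, 256  # assumed sizes
rng = np.random.default_rng(0)

def wpd_energy_features(signal, wavelet="db4", level=3):
    """Relative energy of each terminal wavelet-packet node."""
    wp = pywt.WaveletPacket(signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    energies = np.array([np.sum(node.data ** 2) for node in nodes])
    return energies / (energies.sum() + 1e-12)

# Synthetic stand-in for preprocessed fNIRS trials; balanced labels for
# the three gaits (0: small-step/low-speed, 1: small-step/mid-speed,
# 2: mid-step/low-speed).
X_raw = rng.standard_normal((N_TRIALS, N_CHANNELS, N_SAMPLES))
y = np.repeat([0, 1, 2], N_TRIALS // 3)

# Concatenate per-channel WPD energy features into one vector per trial.
X = np.array([
    np.concatenate([wpd_energy_features(trial[ch]) for ch in range(N_CHANNELS)])
    for trial in X_raw
])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In the study itself, the characteristic channels would first be selected from the WPD output and the best recognition model found by searching over permutations and combinations of the feature vectors; the naive concatenation of all channels above stands in for that selection step.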