Human facial neural activities and gesture recognition for machine-interfacing applications
Main Authors: | M Hamedi, Sh-Hussain Salleh, TS Tan, K Ismail, J Ali, C Dee-Uam, C Pavaganun, PP Yupapin |
Format: | Article |
Language: | English |
Published: | Dove Medical Press, 2011-12-01 |
Series: | International Journal of Nanomedicine |
Online Access: | http://www.dovepress.com/human-facial-neural-activities-and-gesture-recognition-for-machine-int-a8895 |
Summary: | M Hamedi,1 Sh-Hussain Salleh,2 TS Tan,2 K Ismail,2 J Ali,3 C Dee-Uam,4 C Pavaganun,4 PP Yupapin5 (1Faculty of Biomedical and Health Science Engineering, Department of Biomedical Instrumentation and Signal Processing, University of Technology Malaysia, Skudai; 2Centre for Biomedical Engineering Transportation Research Alliance; 3Institute of Advanced Photonics Science, Nanotechnology Research Alliance, University of Technology Malaysia (UTM), Johor Bahru, Malaysia; 4College of Innovative Management, Valaya Alongkorn Rajabhat University, Pathum Thani; 5Nanoscale Science and Engineering Research Alliance (N'SERA), Advanced Research Center for Photonics, Faculty of Science, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand). Abstract: The authors present a new method of recognizing different human facial gestures through their neural activities and muscle movements, which can be used in machine-interfacing applications. Human–machine interface (HMI) technology utilizes human neural activities as input controllers for the machine. Recently, much work has been done on the specific application of facial electromyography (EMG)-based HMIs, which have used limited, fixed numbers of facial gestures. In this work, a multipurpose interface is suggested that can support 2–11 control commands and can be applied to various HMI systems. The significance of this work is identifying the most accurately recognized facial gestures for any application requiring up to eleven control commands. EMG signals for eleven facial gestures are recorded from ten volunteers. The detected EMG signals are passed through a band-pass filter, and root mean square features are extracted. Combinations of gestures, with a different number of gestures in each group, are formed from the recorded facial gestures. Finally, all combinations are trained and classified by a fuzzy c-means classifier, and the combination with the highest recognition accuracy in each group is chosen. The average accuracy of the chosen combinations exceeds 90%, demonstrating their suitability as command controllers. (A brief sketch of the described processing pipeline follows this record.) Keywords: neural system, neural activity, electromyography, machine learning, muscle activity |
ISSN: | 1176-9114, 1178-2013 |
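The abstract describes a standard EMG processing chain: band-pass filtering, windowed root mean square (RMS) feature extraction, and fuzzy c-means classification. The sketch below is a minimal, illustrative Python version of that chain, not the authors' implementation; the sampling rate, filter band, window length, fuzzifier, and the synthetic test signals are all assumptions, since the record does not report those parameters.

```python
# Hedged sketch (not the authors' code): an EMG pipeline in the spirit of the
# abstract -- band-pass filtering, windowed RMS features, fuzzy c-means.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0           # assumed sampling rate (Hz)
BAND = (20.0, 450.0)  # assumed EMG band-pass edges (Hz)
WIN = 200             # assumed RMS window length (samples)

def bandpass(emg, fs=FS, band=BAND, order=4):
    """Zero-phase Butterworth band-pass filter applied to a 1-D EMG signal."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, emg)

def rms_features(emg, win=WIN):
    """RMS value of consecutive non-overlapping windows."""
    n = len(emg) // win
    frames = emg[: n * win].reshape(n, win)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def fuzzy_cmeans(X, c, m=2.0, tol=1e-5, max_iter=300, seed=0):
    """Plain NumPy fuzzy c-means; returns cluster centres and membership matrix."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                       # memberships sum to 1 per sample
    for _ in range(max_iter):
        Um = U ** m
        centres = Um @ X / Um.sum(axis=1, keepdims=True)
        dist = np.linalg.norm(X[None, :, :] - centres[:, None, :], axis=2)
        dist = np.fmax(dist, 1e-12)          # avoid division by zero
        U_new = 1.0 / (dist ** (2.0 / (m - 1)))
        U_new /= U_new.sum(axis=0)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centres, U

if __name__ == "__main__":
    # Toy stand-in for recorded facial-gesture EMG: two synthetic "gestures".
    rng = np.random.default_rng(1)
    quiet = 0.05 * rng.standard_normal(4000)
    active = 0.5 * rng.standard_normal(4000)
    feats = np.concatenate([rms_features(bandpass(quiet)),
                            rms_features(bandpass(active))])[:, None]
    centres, U = fuzzy_cmeans(feats, c=2)
    labels = U.argmax(axis=0)                # hard labels from fuzzy memberships
    print("cluster centres:", centres.ravel())
    print("labels:", labels)
```

Fuzzy c-means is implemented directly in NumPy so the example needs nothing beyond NumPy and SciPy; hard gesture labels are taken as the argmax of the fuzzy membership matrix, which is one common way to turn fuzzy memberships into command decisions.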