Deep Human Activity Recognition With Localisation of Wearable Sensors
Automatic recognition of human activities from wearable sensors remains challenging because of the high inter-person variability in gait and movement. Moreover, finding the best on-body location for a wearable sensor is also critical, as the sensor's position provides valuable context information that can be used for accurate recognition. This article addresses the problem of classifying motion signals generated by multiple wearable sensors for the recognition of human activity and the localisation of the wearable sensors. Unlike existing methods that extract time- and frequency-based features directly from the raw accelerometer and gyroscope signals for activity inference, we propose to create frequency images from the raw signals and show this representation to be more robust. The frequency image sequences are generated from accelerometer and gyroscope signals recorded at seven different body parts. These frequency images serve as the input to our proposed two-stream Convolutional Neural Network (CNN), which predicts both the human activity and the location of the sensor generating the activity signal. We show that the complementary information collected by the accelerometer and gyroscope sensors can be leveraged to develop an effective classifier that accurately predicts the performed activity. We evaluate the proposed method using a cross-subject protocol and show that it achieves an F1-score of 0.90 on a publicly available real-world human activity dataset, surpassing the result reported by a state-of-the-art method on the same dataset. Moreover, we also run experiments with data from the individual body locations to determine the best sensor position for the task, and find that the shin and waist are the best placements, a finding that could help other researchers collect higher-quality activity data. We plan to publicly release the generated frequency images for all sensor positions and activities, together with our implementation code, upon publication.
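The article converts raw accelerometer and gyroscope signals into "frequency images" before feeding them to a CNN. The record does not give the paper's exact pipeline or parameters, but the idea resembles a short-time Fourier transform (spectrogram). A minimal illustrative sketch, where the window length, hop size, and 50 Hz sampling rate are assumptions:

```python
import numpy as np

def frequency_image(sig, win=64, hop=32):
    """Slide a Hann window over a 1-D sensor signal and stack magnitude
    FFTs column-wise, giving a (win // 2 + 1, n_frames) "frequency image"."""
    n_frames = 1 + (len(sig) - win) // hop
    window = np.hanning(win)
    cols = [np.abs(np.fft.rfft(sig[i * hop : i * hop + win] * window))
            for i in range(n_frames)]
    return np.log1p(np.stack(cols, axis=1))  # log scale compresses dynamic range

# Simulated 10 s of one accelerometer axis at 50 Hz: a 2 Hz gait rhythm plus noise.
rng = np.random.default_rng(0)
t = np.arange(500) / 50.0
acc = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)

img = frequency_image(acc)
print(img.shape)  # (33, 14): 33 frequency bins x 14 time frames
```

In a setup like the paper's, one such image would be built per sensor axis and body location, and the accelerometer and gyroscope images would feed the two CNN streams.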
Main Authors: | Isah A. Lawal, Sophia Bano |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Human activity recognition; deep learning; sensor localisation; wearable sensors |
Online Access: | https://ieeexplore.ieee.org/document/9170502/ |
id |
doaj-5f559eccea394fc8bc7dde2544c5e372 |
---|---|
record_format |
Article |
spelling |
doaj-5f559eccea394fc8bc7dde2544c5e372 (2021-03-30T04:05:34Z). English. IEEE, IEEE Access, ISSN 2169-3536, published 2020-01-01, vol. 8, pp. 155060-155070, DOI 10.1109/ACCESS.2020.3017681, article 9170502. "Deep Human Activity Recognition With Localisation of Wearable Sensors". Isah A. Lawal (https://orcid.org/0000-0002-3108-5997), Faculty of Applied Computing and Technology (FACT), Noroff University College, Kristiansand, Norway; Sophia Bano (https://orcid.org/0000-0003-1329-4565), Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London, London, U.K. Online access: https://ieeexplore.ieee.org/document/9170502/. Keywords: human activity recognition; deep learning; sensor localisation; wearable sensors |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Isah A. Lawal, Sophia Bano |
author_sort |
Isah A. Lawal |
title |
Deep Human Activity Recognition With Localisation of Wearable Sensors |
publisher |
IEEE |
series |
IEEE Access |
issn |
2169-3536 |
publishDate |
2020-01-01 |
description |
Automatic recognition of human activities from wearable sensors remains challenging because of the high inter-person variability in gait and movement. Moreover, finding the best on-body location for a wearable sensor is also critical, as the sensor's position provides valuable context information that can be used for accurate recognition. This article addresses the problem of classifying motion signals generated by multiple wearable sensors for the recognition of human activity and the localisation of the wearable sensors. Unlike existing methods that extract time- and frequency-based features directly from the raw accelerometer and gyroscope signals for activity inference, we propose to create frequency images from the raw signals and show this representation to be more robust. The frequency image sequences are generated from accelerometer and gyroscope signals recorded at seven different body parts. These frequency images serve as the input to our proposed two-stream Convolutional Neural Network (CNN), which predicts both the human activity and the location of the sensor generating the activity signal. We show that the complementary information collected by the accelerometer and gyroscope sensors can be leveraged to develop an effective classifier that accurately predicts the performed activity. We evaluate the proposed method using a cross-subject protocol and show that it achieves an F1-score of 0.90 on a publicly available real-world human activity dataset, surpassing the result reported by a state-of-the-art method on the same dataset. Moreover, we also run experiments with data from the individual body locations to determine the best sensor position for the task, and find that the shin and waist are the best placements, a finding that could help other researchers collect higher-quality activity data. We plan to publicly release the generated frequency images for all sensor positions and activities, together with our implementation code, upon publication. |
topic |
Human activity recognition; deep learning; sensor localisation; wearable sensors |
url |
https://ieeexplore.ieee.org/document/9170502/ |