Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications
Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization must be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants of state-of-the-art deep learning architectures and show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarce, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset can easily be transferred to other domains with a very small amount of labeled data. Taken together, these results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications.
Main Authors: | Alejandro Cartas, Petia Radeva, Mariella Dimiccoli |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Daily activity recognition; visual lifelogs; domain adaptation; wearable cameras |
Online Access: | https://ieeexplore.ieee.org/document/9078767/ |
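The abstract's domain-adaptation result (transferring a model to data from a different wearable camera using very few labels) is, in spirit, a transfer-learning setup. The following is a minimal, illustrative sketch of that idea, not the paper's actual method: the feature extractor is kept frozen and only a linear classification head is trained on a small labeled target set. All sizes, the random stand-in features, and the plain gradient-descent loop are assumptions for the sake of a self-contained demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for frozen backbone features: in practice these would be
# embeddings from a network trained on the source-domain egocentric dataset.
n_samples, n_features, n_classes = 40, 128, 5  # hypothetical sizes
features = rng.standard_normal((n_samples, n_features))
labels = rng.integers(0, n_classes, n_samples)

# Train only a linear classification head (softmax regression) on the
# small labeled target set, keeping the feature extractor fixed.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
one_hot = np.eye(n_classes)[labels]

for _ in range(300):  # simple full-batch gradient descent
    logits = features @ W + b
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    grad = (probs - one_hot) / n_samples  # gradient of cross-entropy loss
    W -= 0.1 * features.T @ grad
    b -= 0.1 * grad.sum(axis=0)

train_acc = ((features @ W + b).argmax(axis=1) == labels).mean()
```

With a frozen extractor, only `n_features * n_classes + n_classes` parameters are learned, which is why a very small labeled target set can suffice; the paper's reported transfer result is consistent with this general principle, though its concrete adaptation strategy is described in the full text.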
id |
doaj-624f0ebeb0c1485484a6293e80ff1d85 |
citation |
IEEE Access, vol. 8, pp. 77344-77363, 2020-01-01; ISSN 2169-3536; DOI 10.1109/ACCESS.2020.2990333 |
authors_affiliations |
Alejandro Cartas (https://orcid.org/0000-0002-4440-9954), Mathematics and Computer Science Department, University of Barcelona, Barcelona, Spain; Petia Radeva, Mathematics and Computer Science Department, University of Barcelona, Barcelona, Spain; Mariella Dimiccoli, Institut de Robòtica i Informàtica Industrial, CSIC-UPC, Barcelona, Spain |
collection |
DOAJ |