Summary: | Daily monitoring of vital signs is important for detecting diseases at an early stage and for preventive treatment, and it can be achieved by taking advantage of the omnipresence of cameras in people's personal space. Since heart disease is among the leading causes of death worldwide, monitoring heart-related vital signs is particularly crucial. In this article we propose a robust, touchless method for estimating heart rate through analysis of face videos. In particular, we consider a challenging scenario in which the user is on a video call and may frequently move their head. Existing touchless, vision-based methods rely on either photoplethysmography (PPG) or ballistocardiography (BCG). PPG methods exploit the color changes in human skin caused by blood volume variations during heartbeats, but they are very sensitive to unstable lighting conditions. BCG methods, on the other hand, exploit the subtle head motions caused by the Newtonian reaction to the influx of blood into the head at each heartbeat, and are therefore sensitive to a subject's voluntary head movements. Unlike conventional studies that use either a PPG method or a BCG method alone, we combine both to overcome the weaknesses of each. We use BCG as the primary approach because of its higher accuracy in heart rate estimation, and rely on PPG as a backup to preserve accuracy in cases of large and frequent voluntary head movements. To this end, we introduce a dynamic voting system that effectively combines the results of several variants of PPG and BCG methods. Experiments conducted on 20 healthy subjects with different skin tones under different lighting conditions show that our method is more accurate than state-of-the-art methods and handles large voluntary head movements well. Our method achieved a mean absolute error of 1.23 beats per minute (BPM) in cases without voluntary head movements and 2.78 BPM in cases with voluntary head movements.
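To give a flavor of how such a dynamic voting step might work, the following is a minimal sketch, not the paper's actual algorithm: the function name, the motion threshold, the agreement tolerance, and the BCG-over-PPG weighting are all assumptions introduced for illustration. Each PPG/BCG variant contributes one BPM estimate, estimates vote for nearby candidates, and the weighting of BCG votes is reduced when head motion is large.

```python
def fuse_heart_rate_estimates(bcg_bpm, ppg_bpm, head_motion,
                              motion_thresh=2.0, tol_bpm=3.0):
    """Fuse per-method BPM estimates with a simple weighted voting scheme.

    bcg_bpm, ppg_bpm : lists of BPM estimates from BCG / PPG variants.
    head_motion      : average head displacement in the analysis window.
    motion_thresh    : above this, BCG is treated as unreliable (assumed value).
    tol_bpm          : two estimates "agree" if they differ by at most this much.
    """
    # Weight BCG estimates higher by default; down-weight them when the
    # subject moves the head a lot (hypothetical weights, not from the paper).
    bcg_w = 2.0 if head_motion < motion_thresh else 0.5
    candidates = [(b, bcg_w) for b in bcg_bpm] + [(p, 1.0) for p in ppg_bpm]

    # Each candidate gathers weighted support from all estimates within
    # tol_bpm of itself; the candidate with the largest support wins.
    best_bpm, best_support = None, -1.0
    for bpm, _ in candidates:
        support = sum(w for other, w in candidates if abs(other - bpm) <= tol_bpm)
        if support > best_support:
            best_bpm, best_support = bpm, support

    # Return the weighted mean of the winning cluster for a smoother output.
    cluster = [(b, w) for b, w in candidates if abs(b - best_bpm) <= tol_bpm]
    return sum(b * w for b, w in cluster) / sum(w for _, w in cluster)


# Example: three BCG variants roughly agree, a PPG outlier is outvoted.
print(fuse_heart_rate_estimates([71.0, 72.5, 70.8], [69.9, 88.0], head_motion=0.8))
```

The key design point this sketch illustrates is graceful degradation: when head motion is small, the BCG estimates dominate the vote; when motion is large, their weight drops and the PPG estimates carry the decision, which mirrors the BCG-primary / PPG-backup strategy described above.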