
“Towards Camera-Based HRV Estimation in the Car” published in the IEEE IoT Journal (IF 9.47)

Monitoring heart activity has long been a vision of many automobile manufacturers and researchers, as it relates not only to drivers' health but also to their driving safety. How can an AIoT solution bring us closer to this goal?

By analysing the data collected in our project ARNE, we discovered that outliers in heart activity can be reliably inferred from a driver's facial expressions via machine learning models, because heart activity and facial expressions are both governed by the autonomic nervous system. Our machine learning solution thus supports driving safety while also providing well-being monitoring features. This technique has the potential to become widespread because driver-facing cameras will very likely be a mandatory component of future driver monitoring systems. For more details, please refer to the publication:

Liu, S., Koch, K., Zhou, Z., Maritsch, M., He, X., Fleisch, E., Wortmann, F.: Towards Non-Intrusive Camera-Based Heart Rate Variability Estimation in the Car under Naturalistic Condition. IEEE Internet of Things Journal, DOI: 10.1109/JIOT.2021.3131742. [PDF]

Abstract
Driver status monitoring systems will be a vital component of future smart cars, especially in an era when an increasing amount of time is spent in the vehicle. Heart rate (HR) is one of the most important physiological signals of driver status. To infer the HR of drivers, mainstream research has focused on capturing subtle heartbeat-induced vibrations of the torso or has leveraged photoplethysmography (PPG), which detects cardiac-cycle-related blood volume changes in the microvasculature. However, existing approaches rely on dedicated sensors that are expensive and cumbersome to integrate, or are vulnerable to ambient noise. Moreover, their performance in detecting HR does not guarantee a reliable computation of heart rate variability (HRV) measures, which are more applicable metrics for inferring mental and physiological status. The accurate computation of HRV measures requires precise measurement of the beat-to-beat interval, which can only be accomplished by medical-grade devices that attach electrodes to the body. Considering these challenges, we propose a facial-expression-based HRV estimation solution. The rationale is to establish a link between facial expression and heartbeat, since both are controlled by the autonomic nervous system. To solve this problem, we developed a tree-based probabilistic fusion neural network approach, which significantly improved HRV estimation performance compared to conventional random forest or neural network methods and to measurements from smartwatches. The proposed solution relies only on a commodity camera and a lightweight algorithm, facilitating its ubiquitous deployment in current and future vehicles. Our experiments are based on 3,400 km of driving data from nine drivers collected in a naturalistic field study.
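The abstract notes that HRV measures are derived from precise beat-to-beat (inter-beat) intervals rather than from the average heart rate alone. As a minimal illustration of what this means (not taken from the paper), the sketch below computes two standard time-domain HRV metrics, SDNN and RMSSD, from a hypothetical series of inter-beat intervals; the interval values are invented for the example.

```python
import numpy as np

def hrv_metrics(ibi_ms):
    """Compute two standard time-domain HRV measures from a series of
    beat-to-beat (inter-beat) intervals given in milliseconds.

    SDNN:  standard deviation of the intervals (overall variability).
    RMSSD: root mean square of successive differences (short-term variability).
    """
    ibi_ms = np.asarray(ibi_ms, dtype=float)
    sdnn = float(np.std(ibi_ms, ddof=1))
    rmssd = float(np.sqrt(np.mean(np.diff(ibi_ms) ** 2)))
    return {"SDNN_ms": sdnn, "RMSSD_ms": rmssd}

# Two hypothetical drivers with the same mean heart rate (~60 bpm,
# i.e. mean inter-beat interval of ~1000 ms) but very different HRV.
steady = [1000, 1005, 995, 1000, 1002, 998]     # low beat-to-beat variability
variable = [900, 1100, 950, 1050, 920, 1080]    # high beat-to-beat variability

print(hrv_metrics(steady))    # small SDNN and RMSSD
print(hrv_metrics(variable))  # large SDNN and RMSSD
```

The two example series have the same mean heart rate but very different SDNN and RMSSD, which illustrates the abstract's point that detecting HR accurately does not by itself guarantee reliable HRV estimation.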

December 6th, 2021