
Cost-effective Mobile Solution for Autonomous and Continuous Vital Signs Monitoring
  • Hen-Wei Huang ,
  • Jack Chen ,
  • Philipp Rupp ,
  • Claas Ehmke ,
  • Peter Chai ,
  • Riya Dhar ,
  • Ian Ballinger ,
  • Giovanni Traverso
Corresponding Author: Giovanni Traverso, Massachusetts Institute of Technology ([email protected])

Abstract

The COVID-19 pandemic has renewed interest in contactless vital signs monitoring using state-of-the-art computer vision, which can efficiently screen for symptoms while reducing the risk of disease transmission. Despite promising performance, static camera setups require subjects to remain stationary inside a field of view (FoV) for a pre-specified duration. Because of inconsistent ambient environmental conditions, the transit of individuals through the FoV, and the time it may take to triage individuals, static camera systems for continuous vital signs monitoring have seen suboptimal uptake. Robotic systems enable autonomous and continuous monitoring, but they require expensive cameras, computers, and robotic platforms, limiting widespread deployment. In response, we propose a cost-effective and scalable robotic solution consisting of a suite of commercial, off-the-shelf wireless cameras for capturing photoplethysmography (PPG) on ambulatory subjects, linked to a single computer that supervises the cameras and computes the subjects' vital signs. Through careful investigation of each step of the wireless machine-vision pipeline, from camera to computer, we identify the bottlenecks constraining wireless live-streaming of high-quality PPG information and address them with a hybrid centralized/decentralized wireless machine-vision protocol. Our results demonstrate that the proposed cost-effective wireless camera achieves remote-PPG (rPPG) accuracy equivalent to its costly USB3 counterpart (mean error: 5.0 BPM vs. 4.7 BPM); the hybrid camera protocol boosts the overall frame rate to 17 FPS, whereas the standard method capturing PPG at the same spatial resolution achieves only 1 FPS. In addition, this work elucidates how distance, image pixel density, frame rate, image compression, image downsampling, and color depth affect rPPG performance, and for each effect we discuss potential solutions for the cost-effective setup.
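
To make the rPPG computation referenced above concrete, the sketch below illustrates the commonly used green-channel approach to estimating heart rate from a sequence of face-cropped camera frames. This is a minimal illustration under assumed tooling (NumPy/SciPy), not the authors' pipeline or hybrid camera protocol; the function name estimate_heart_rate_bpm and the specific filter settings are hypothetical.

```python
# Minimal green-channel rPPG sketch (illustrative only; not the authors' method).
# Assumes a window of face-cropped BGR frames captured at a known frame rate.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate_bpm(face_frames, fps):
    """Estimate heart rate (BPM) from the mean green-channel signal of face ROIs.

    face_frames: sequence of HxWx3 uint8 BGR frames cropped to the face
                 (several seconds of video are needed for a stable estimate).
    fps: capture frame rate in frames per second.
    """
    # 1. Spatially average the green channel (strongest PPG component) per frame.
    signal = np.array([f[:, :, 1].astype(np.float64).mean() for f in face_frames])
    signal -= signal.mean()

    # 2. Band-pass filter to the plausible cardiac band (0.7-4 Hz, i.e. 42-240 BPM).
    nyquist = fps / 2.0
    b, a = butter(3, [0.7 / nyquist, 4.0 / nyquist], btype="band")
    filtered = filtfilt(b, a, signal)

    # 3. Take the dominant frequency of the filtered signal via FFT.
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(filtered))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # convert Hz to beats per minute
```

This sketch also makes the frame-rate result intuitive: at the reported 17 FPS the Nyquist limit (8.5 Hz) comfortably covers the cardiac band, whereas at 1 FPS (Nyquist 0.5 Hz) typical heart rates of 1-2 Hz alias and cannot be recovered.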