Towards Robust Imaging Photoplethysmography in Unconstrained Settings

Date
2021-04-28
Abstract

As blood flows through the skin, the varying blood concentration subtly changes the skin's color over time, allowing cameras to monitor vital signs remotely. While there has been rapid progress in the technology and algorithms for camera-based physiology, most existing methods have only been validated in controlled, unrealistic laboratory settings. In this thesis, we present three approaches to translate this technology from the laboratory to diverse real-world settings. First, if we know the application of interest and the specific corruption we expect to encounter, we can develop hardware and algorithmic solutions to address those sources of corruption. We present a joint hardware and software solution to reduce illumination variations during driving, and we show that deep learning models can be trained to overcome video compression artifacts in a telemedicine application. Second, we present a denoising approach using Inverse Convolutional Attention Networks that improves the quality of physiological signals in the presence of diverse and unknown sources of corruption. Third, we propose a novel data augmentation approach to reduce overfitting and to improve the cross-dataset generalizability of deep learning models trained on small, limited datasets.
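The measurement principle behind imaging photoplethysmography can be illustrated with a minimal, hypothetical sketch (not taken from the thesis, and using illustrative parameter values only): average a skin region's green-channel intensity per frame, band-pass filter the resulting time series to the plausible heart-rate band, and read off the dominant frequency.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative sketch only, not the thesis's method.
# Simulate the per-frame mean green-channel intensity of a skin region:
# a weak pulse at 1.2 Hz (72 bpm) buried in noise, sampled at 30 fps.
fps = 30.0
t = np.arange(0, 20, 1 / fps)                 # 20 seconds of "video" frames
rng = np.random.default_rng(0)
green_mean = 0.1 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0.0, 0.2, t.size)

# Band-pass to a plausible heart-rate range, 0.7-4 Hz (42-240 bpm).
b, a = butter(3, [0.7, 4.0], btype="bandpass", fs=fps)
filtered = filtfilt(b, a, green_mean)

# Estimate heart rate as the dominant frequency of the filtered signal.
freqs = np.fft.rfftfreq(filtered.size, d=1 / fps)
power = np.abs(np.fft.rfft(filtered)) ** 2
hr_bpm = 60.0 * freqs[np.argmax(power)]
```

On real footage this step would be preceded by face detection, skin segmentation, and motion compensation; the corruptions that this toy signal ignores, such as motion, illumination changes, and compression artifacts, are exactly what the thesis addresses.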

Degree
Doctor of Philosophy
Type
Thesis
Keywords
imaging photoplethysmography, deep learning, computer vision, computational imaging
Citation

Nowara, Ewa Magdalena. "Towards Robust Imaging Photoplethysmography in Unconstrained Settings." (2021) Diss., Rice University. https://hdl.handle.net/1911/110424.

Rights
Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.