Scalable Health Labs

Towards Bio-Behavioral Medicine


Robust non-contact vital signs monitoring using a camera

Mayank Kumar, Ashok Veeraraghavan, and Ashutosh Sabharwal, DistancePPG: Robust non-contact vital signs monitoring using a camera. Biomedical Optics Express, Vol. 6 Issue 4, pp.1407-1418 (2015) PDF (Patent Pending)


Mayank Kumar, Graduate Student, ECE
Dr. Ashok Veeraraghavan, Associate Professor, ECE
Dr. Ashutosh Sabharwal, Professor, ECE


The distancePPG algorithm powers Rice CameraVitals, which has won multiple accolades (see News for the keyword CameraVitals) and has been covered by several news outlets:

NASA Tech Briefs, The Daily Mail, MedGadget, Texas Instruments, SPIE Optics, and The Economic Times


Overview: Robust non-contact vital signs monitoring using a camera

THE PROBLEM: Measuring and monitoring a patient’s vital signs is essential for their care; in fact, all care begins by collecting vital signs like heart rate. The current standard of care is based on monitoring devices that require contact – electrocardiograms, pulse oximeters, blood pressure cuffs, and chest straps. However, contact-based methods have serious limitations for monitoring the vital signs of newborns: neonates have extremely sensitive skin, and most contact-based techniques cause skin abrasions, peeling, and damage every time the leads or patches are removed. This can lead to infection, increasing the mortality risk for neonates.
OUR SOLUTION: We propose to use an ordinary camera to measure a patient’s vital signs by simply recording a video of their face in a non-contact manner. From the recorded video, our algorithm, distancePPG, extracts pulse rate (PR), pulse rate variability (PRV), and breathing rate (BR). The algorithm estimates tiny changes in skin color caused by changes in blood volume underneath the skin surface. These changes are invisible to the naked eye, but not to the camera.
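As a minimal sketch of this idea (not the paper's full algorithm): average the green channel over a skin region, remove the DC component, and read the pulse rate off the dominant spectral peak in the cardiac band. The function name, band limits, and interface below are illustrative assumptions.

```python
import numpy as np

def pulse_rate_from_video(frames, fps, lo_hz=0.7, hi_hz=4.0):
    """Estimate pulse rate (beats/min) from a stack of skin-region frames.

    frames : array of shape (T, H, W, 3), RGB video of a skin patch
    fps    : frames per second of the recording
    The green channel is used because hemoglobin absorption makes the
    blood-volume pulse strongest there.
    """
    # Spatially average the green channel -> one sample per frame
    g = frames[:, :, :, 1].reshape(frames.shape[0], -1).mean(axis=1)
    g = g - g.mean()                      # remove the DC component

    # Find the dominant frequency inside the cardiac band (0.7-4 Hz,
    # i.e. roughly 42-240 bpm; an illustrative choice)
    spec = np.abs(np.fft.rfft(g))
    freqs = np.fft.rfftfreq(len(g), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    peak_hz = freqs[band][np.argmax(spec[band])]
    return 60.0 * peak_hz                 # Hz -> beats per minute
```

On a synthetic 10-second clip whose green channel is modulated at 1.2 Hz, this returns approximately 72 bpm.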

Our algorithm, distancePPG (patent pending), achieves clinical-grade accuracy for all skin tones, works under low-light conditions, and can account for the natural motion of subjects. It does so by intelligently combining the skin color change signals from different regions of the visible skin in a manner that improves the overall signal strength. Our algorithm yields as much as 6 dB of SNR improvement in harsh scenarios, rapidly expanding the scope, viability, reach, and utility of CameraVitals as a replacement for traditional contact-based vital sign monitors.


Four basic steps in CameraVitals: Step (i) Extract landmark points such as eyes, nose, mouth, and face boundary from the face image. Step (ii) Divide the face into seven regions, each tracked over the video using a computer vision tracker. Step (iii) Further divide each tracked region into small regions of interest (ROIs). Step (iv) distancePPG computes the goodness metric associated with each ROI based only on the video recordings, and estimates the camera-based PPG signal with a much higher SNR (signal-to-noise ratio). For more details, please read our paper.
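The combination in step (iv) can be sketched as a weighted average of per-ROI signals. Here the goodness metric is taken to be each ROI's ratio of in-band to out-of-band spectral power; this is an illustrative stand-in, not the metric defined in the paper, and all names and parameters are assumptions.

```python
import numpy as np

def goodness(sig, fps, lo_hz=0.7, hi_hz=4.0):
    """SNR-like goodness metric for one ROI signal: ratio of spectral
    power inside the cardiac band to power outside it (an illustrative
    stand-in for the paper's metric)."""
    sig = sig - np.mean(sig)
    power = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return power[band].sum() / (power[~band].sum() + 1e-12)

def combine_rois(roi_signals, fps):
    """Weighted average of per-ROI color-change signals, with weights
    proportional to each ROI's goodness metric, so that noisy regions
    (hair, eyes, poorly lit skin) contribute little to the combined
    PPG estimate."""
    sigs = np.asarray(roi_signals, dtype=float)
    sigs = sigs - sigs.mean(axis=1, keepdims=True)   # per-ROI DC removal
    weights = np.array([goodness(s, fps) for s in sigs])
    weights = weights / weights.sum()
    return weights @ sigs                 # (R,) @ (R, T) -> (T,)
```

Given one clean pulsatile ROI and one dominated by noise, the clean ROI receives nearly all of the weight, so the combined signal closely follows the true pulse waveform.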


June 2015: Texas Instruments highlights the Camera Vitals project on their Think.Innovate Blog.

February 2015: DistancePPG paper accepted in Biomedical Optics Express.

October 2014: DistancePPG (v2.0) tested on subjects under different motion scenarios.

August 2014: DistancePPG version 2.0, with improved motion-handling capability, developed.

April 2014: DistancePPG (v1.0) received first prize in the graduate poster competition at Rice ECE Corporate Affiliates Day.

March 2014: DistancePPG version 1.0 algorithm tested on its first 12 subjects with different skin tones, and 2 subjects under different lighting conditions.

distancePPG: Skin tone from Mayank Kumar on Vimeo.

distancePPG: motion from Mayank Kumar on Vimeo.

distancePPG: light from Mayank Kumar on Vimeo.