When I first heard of a system that could determine your heart rate without actually touching you, I was skeptical to the point where I dismissed the claim as belonging somewhere between fakery and voodoo. Many moons later I had a reason to look deeper into the techniques required and realized it was not only possible, but had already been achieved, and furthermore had been implemented in the latest version of the Intel® RealSense™ SDK.
It was only when I located and ran the appropriate sample program from the SDK, read the value, and then checked it against my own heart rate by counting the beats in my neck for 15 seconds and multiplying by four, that I realized it actually works! I jumped up and down a little to get my heart rate up, and amazingly, after some seconds, the computer once again accurately calculated my accelerated rate. Of course by this time I was so pumped at the revelation and excited at the prospects of a computer that knows how calm you are, that I could not get my heart rate down below my normal 76 beats per minute to test the lower levels.
2. Why Is This Important?
Once you begin your journey into the frontier world of hands-free control systems, 3D scanning, and motion detection, you will eventually find yourself asking what else you can do with the Intel® RealSense™ camera. When you move from large clearly defined systems to the more subtle forms of detection, you enter a realm where computers gain abilities never seen before.
Pulse detection, along with other Intel RealSense SDK features, is a much more subtle stream of information that may one day play as critical a role in your daily life as your keyboard or mouse. For example, a keyboard or mouse is no good to you if you’re suffering from RSI (Repetitive Strain Injury), and no amount of clever interfacing will help you if you’re distracted, agitated, sleepy, or simply unhappy. Using the subtle science of reading a user’s physical, and possibly emotional, condition allows the computer to do something about it for the user’s benefit and improve the experience. Let’s say it’s nine thirty in the morning, the calendar shows a full day of work ahead, and the computer detects the user is sleepy and distracted. Using some pre-agreed recipes, the computer could trigger your favorite ‘wake me up with 5 power ballads’ music, flash up your calendar for the next 4 hours, and throw some images of freshly brewed coffee on screen as a gentle reminder to switch up a gear.
Technological innovation isn’t always about what button does what or how we can make things quicker, easier, or smarter, it can also be about improving quality of life and enriching an experience. If your day can be made better because your computer has a sense of what you might need and then takes autonomous steps to help you, that can only be a good thing.
By way of another example and not directly related to pulse detection, imagine your computer is able to detect temperature and notices that when you get hot your work rate drops (i.e., less typing, more distracted, etc.) and also records that when the temperature was cooler, your work level increases. Now imagine it recorded sensor metrics about you on a daily basis, and during a particularly hot morning your computer flashes a remark that two days ago you had also been hot, you left the desk for 20 seconds, and 2 minutes later everything was cool (and your subsequent work level improved that day). Such a prompt might recall a memory that you opened a few windows, or turned on the air conditioning in the next room, and so you follow the advice and your day improves. Allowing the computer to collect this kind of data and experimenting with the ways in which this data can improve your own life will ultimately lead to innovations that will improve life for everyone.
Pulse estimation is just one way in which a computer can extract subtle data from the surrounding world, and as technology evolves, the sophistication of pulse detection will lead to readings as accurate as traditional methods.
3. How Is This Even Possible?
My research into precisely how pulse estimation currently works took me on a brief journey through the techniques that have proved successful so far, such as detecting so-called micro-movements of the head.
You need more than a tape measure to detect micro-movements in the head.
Apparently when your heart beats, a large amount of blood is pumped into your head to keep your brain happy, and this produces an involuntary and minuscule movement that can be detected by a high resolution camera. By counting these movements, filtered by normal Doppler and other determinable movements, you can work out how many beats the user is likely to have per minute. Of course, many factors can disrupt this technique, such as natural movements that can be mistaken for micro-movements, capturing shaky footage if you are in transit at the time, or simply being cold and shivering. Under regulated conditions, this technique has been proven to work with nothing more than a high resolution color camera and software capable of filtering out visual noise and detecting the pulses.
Another technique that is closer to the method used by the Intel RealSense SDK is the detection of color changes in a live stream and using those color changes to determine if a pulse happened. The frame rate does not have to be particularly high for this technique to work, nor does the camera need to be perfectly still, but the lighting conditions need to be ideal for the best results. This alternative technique has a number of variations, each with varying levels of success, two of which I will briefly cover here.
Did you know your eyes can tell you how fast your heart is beating?
Obviously, the technique works better when you are not wearing glasses, and with a high resolution capture of the eyeball you have an increased chance of detecting subtle changes in the blood vessels of the eye over the course of the detection phase. Unlike veins under the skin that are subject to subsurface scattering and other occlusions, the eye offers a relatively clear window into the vascular system of the head. You do have a few hurdles to overcome, such as locking the pixels for the eye, so you only work with the eye area and not the surrounding skin. You also need to detect blinking and track pupils to ensure no noise gets into the sample, and finally you need to run the sample long enough to get a good sense of background noise that needs to be eliminated before you can magnify the remaining color pixels to help in detecting the pulse.
Your mileage will vary as to how long you need to run the sample, and a lot of noise may force you to throw a sample out, but even running at a modest 30 frames per second you’ll have anywhere from 20 to 30 samples in which to find just one pulse (assuming your subject has a heart rate between 60 and 90 beats per minute).
If you find the color information from the eye is insufficient, such as might occur for users who are sitting a good distance away from the computer, wearing glasses, or meditating, then you need another solution. One more variation on the skin color change method is the use of the IR stream (InfraRed), which is readily provided by the Intel® RealSense™ camera. Unlike color and depth streams, IR streams can be sent to the software at upwards of 300 frames per second, which is quite fast. As suggested before, however, we only need around 30 frames per second of good quality samples to find our elusive pulse, and the IR image you get back from the camera has a special trick to reveal.
Notice the veins in the wrist, made highly visible thanks to Infra-Red
For the purpose of brevity, I will not launch into a detailed description of the properties of IR and its many applications. Suffice it to say that it occupies a specific spectrum of light that the human eye cannot entirely perceive. The upshot is that when we bounce this special light off objects, capture the results, and convert them to something we can see, it reacts a little differently than its neighboring colors higher up the spectrum.
One of the side effects of bouncing IR off a human is that we can detect veins near the surface of the skin and other characteristics such as grease on an otherwise perfectly clean shirt. Given that blood flow is the precise metric we want to measure, you might think this approach is perfectly suited to the job of detecting a heart rate. With a little research you will find that IR has indeed been used for the purpose of scanning the human body and detecting the passage of blood around the circulatory system, but only under strict medical conditions. The downside to using IR is that you effectively limit the information you are receiving from the camera and must throw away the equally valuable visible spectrum returned via the regular RGB color stream.
Of course, the ultimate solution is to combine all three sources of information; taking micro-movements, IR blood flow, and full color skin changes to act as a series of checks and balances to reject false positives and produce a reliable pulse reading.
4. How Intel® RealSense™ Technology Detects Your Pulse
Now that you know quite a bit about the science of touchless heart rate detection, we are going to explore how you can add this feature to your own software. You are free to scan the raw data coming from the camera and implement one or all of the above techniques, or thanks to the Intel RealSense SDK you can instead implement your own heart rate detection in just a few lines of code.
The first step is not specifically related to the pulse detection function, but for clarity we will cover it here so you have a complete picture of which interfaces you need and which ones you can ignore for now. We first need to create a PXCSession, a PXCSenseManager pointer, and a PXCFaceModule pointer, as we will be using the Face system to eventually detect the heart rate. For a complete version of this source code, the best sample to view and compile against is the Face Tracking example, which contains the code below but with support for additional features such as pose detection.
PXCSession* session = PXCSession_Create();                     // create the SDK session
PXCSenseManager* senseManager = session->CreateSenseManager(); // pipeline manager
PXCFaceModule* faceModule = senseManager->QueryFace();         // access the Face module
Once the housekeeping is done and you have access to the critical faceModule interface, you can make the pulse-specific function calls, starting with the command to enable the pulse detector. The ActiveConfiguration object encompasses all the configuration you need for the Face system, but the one line that specifically relates to getting a heart rate reading is the call to Enable(), which activates this part of the system and starts it running.
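Based on the configuration pattern used by the Face Tracking sample, the enable step looks roughly like the sketch below; the exact interface names may differ between SDK versions, so check your version's documentation.

```
// Sketch only: follows the Face Tracking sample's configuration pattern.
PXCFaceConfiguration* config = faceModule->CreateActiveConfiguration();
config->QueryPulse()->Enable();   // switch on the pulse estimator
config->ApplyChanges();           // commit the configuration to the pipeline
config->Release();                // configuration object no longer needed
```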
The final set of commands drills down to the value we are after, and as you can see below relies on parsing through all the faces that may have been detected by the system. It does not assume that a single user is sitting at the computer—someone could be looking over your shoulder or standing in the background. Your software must make additional checks, perhaps using the pose data structure, to determine which is the main head (perhaps the closest) and only use the heart rate for that face/user. Below is the code that makes no such distinction and simply moves through all the faces detected and takes the heart rate for each one, though it does nothing with the value in this example.
PXCFaceData* faceOutput = faceModule->CreateOutput();
const int numFaces = faceOutput->QueryNumberOfDetectedFaces();
for (int i = 0; i < numFaces; ++i)
{
    PXCFaceData::Face* trackedFace = faceOutput->QueryFaceByIndex(i);
    const PXCFaceData::PulseData* pulse = trackedFace->QueryPulse();
    if (pulse != NULL)
    {
        pxcF32 hr = pulse->QueryHeartRate();
    }
}
You can ignore most of the code except for QueryPulse(), which asks the system to work out the latest heart rate from the data collected thus far, and, if data is available, QueryHeartRate(), which interrogates that data and returns the heart rate in beats per minute.
An expression of surprise as the pulse estimate was exactly right.
By running the Face Tracking sample included with the Intel RealSense SDK and deselecting everything from the right except detection and pulse, then pressing start, you will be greeted with your own heart rate after 10 seconds of staying relatively still.
Once you have stripped out the non-pulse code from the above example, you can use it as a good code base for further experiments with the technique. Perhaps drawing a graph of the readings over time, or adding code to have the app run in the background and produce an audible beep to let you know when you’re getting too relaxed or excited. More seriously, you can monitor the accuracy and resolution of the readings returned to determine if they are sufficient for your application.
5. Tricks and Tips
- For best results not only when detecting your heart rate but for all capture work, use the camera in good lighting conditions (not exposed to sunlight) and stay relatively still during the sampling phase until you get an accurate reading.
- As the current SDK only provides a single function for the detection of pulse, the door is wide open for innovators to use the range of raw data to obtain more accurate and instant readings from the user. The present heart rate estimate takes over 10 seconds to calculate; can you write one that performs the measurement in less time?
- If you want to perform heart rate estimation outdoors and want to write your own algorithm to perform the analysis, it is recommended you use the color stream only for detecting skin color changes.
- Don’t try to detect a heart rate with all the options in FaceTracking activated as this will reduce the quality of the result or fail to report a value altogether. You will need sufficient processing power available for the Face module to accurately estimate the heart rate.
- Don’t use an IR detection technique in outdoor spaces, as any amount of direct sunlight will completely obliterate the IR signals returned, rendering any analysis impossible.
As touched on at the start of this article, the benefits of heart rate detection are not immediately apparent when compared to the benefits of hands-free controls and 3D scanning, but when combined with other sensory information it can provide incalculable help to the user when they need it most. We’re not yet at the stage where computers can record our heart rate simply as we walk past the doctor’s office window, but we’re halfway there, and it’s only a matter of time and further innovation and application before we see it take its place in our modern world.
From a personal perspective, living the life of an overworked, old-school, code-crunching relic, my health and general work ethic are more important to me now than in my youth, and I am happy to get help from any quarter that provides it. If that help takes the form of a computer nagging me to ‘wake up, drink coffee, eyes front, don’t forget, open a window, take a break, eat some food, go for a walk, play a game, go to the doctor you don’t have a pulse, listen to Mozart, and go to sleep’— especially if it’s a nice computer voice—then so be it.
Of course being a computer, if the nagging gets a little persistent you can always switch it off. Something tells me though that we’ll come to appreciate these gentle reminders, knowing that behind all the cold logic, computers are only doing what we asked them to do, and at the end of the day, we can all use a little help, even old-school, code-crunching relics.
About The Author
When not writing articles, Lee Bamber is the CEO of The Game Creators (http://www.thegamecreators.com), a British company that specializes in the development and distribution of game creation tools. Established in 1999, the company and surrounding community of game makers are responsible for many popular brands including Dark Basic, The 3D Game Maker, FPS Creator, App Game Kit (AGK) and most recently, Game Guru.