Sensor fusion is ushering in a new era of healthcare services as the technology continues to expand its capabilities. Sensors are now used in a wide range of applications, including smartphones, automotive systems, climate monitoring, industrial systems, and healthcare.
Sensors have also become cheaper and smaller, so it is now common to see electronic products equipped with sensors that improve the user experience. Most importantly, sensor-equipped technology is starting to closely mimic the ultimate sensing machine: the human being.
The technology that makes this possible is sensor fusion. It uses a microcontroller, the "brain," to combine data acquired from multiple individual sensors into a view that is more accurate and credible than what any single sensor could provide on its own. In other words, the combined picture is far more informative than the readings taken individually.
What is sensor fusion?
Sensor fusion is the combination of data from multiple sensors to build a more accurate picture of the sensors' subject or environment. Combining two or more sensors produces richer data and compensates for the limitations in range or accuracy of each individual sensor.
Software or applications that aggregate results from multiple sources yield insights faster and allow for more sophisticated analysis than processing the data from each sensor separately. The algorithms used to fuse the combined data are commonly found in mobile devices and health monitoring devices.
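To make the idea concrete, here is a minimal sketch of one of the simplest fusion rules, inverse-variance weighting: each sensor's reading is weighted by how precise it is, so the fused estimate is both closer to the truth and less uncertain than any single reading. The sensor values and variances below are illustrative, not taken from any particular device.

```python
def fuse(readings):
    """Fuse (value, variance) pairs by inverse-variance weighting.

    Each sensor's reading is weighted by 1/variance, so more
    precise sensors contribute more to the fused estimate.
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, readings)) / total
    variance = 1.0 / total  # fused variance is smaller than any input's
    return value, variance

# Two hypothetical temperature sensors measuring the same patient:
# a skin probe (37.2 C, variance 0.04) and an infrared sensor
# (36.8 C, variance 0.16).
fused_value, fused_var = fuse([(37.2, 0.04), (36.8, 0.16)])
print(round(fused_value, 2), round(fused_var, 3))  # → 37.12 0.032
```

Note that the fused variance (0.032) is lower than either sensor's on its own (0.04 and 0.16), which is exactly the benefit the definition above describes: the combination is more trustworthy than any individual reading.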
Human Beings – the best sensing example
Let’s look at how sensing works in the human body to better understand sensor fusion. Our five senses— vision, hearing, smell, taste and touch—all provide sensory information about our environment or surroundings, which travels through the peripheral nervous system (PNS) to the brain. The brain then reacts to a given condition or experience.
The human brain, as the ultimate decision maker, responds to sensory input and sends out motor information—our physical response to the input. The PNS itself doesn’t make complex decisions about the information it transports. For instance, a pedestrian sees a car speeding toward him, and his brain orders his muscles to walk faster to the safe side of the road to avoid an accident.
However, without the peripheral nervous system’s ability to bring in sensory information and send out motor information, we would not be able to react, talk, walk or perform other actions as spontaneously as we do. The human brain can combine sources of sensory information to validate an event or compensate for incomplete information and still make a sound decision. For instance, you may not see the fire under the hood of your car, but the smell of burning rubber and the heat coming from the dash tell your brain that it is time to leave the car because the engine is on fire. Such information helps the brain make decisions.
A similar principle applies to technology: integrating inputs from multiple sensors yields faster and more accurate sensing. Sensor fusion produces higher levels of recognition and allows these technologies to respond appropriately.
Leveraging Sensor Fusion in Healthcare and Remote Patient Monitoring
Today, a variety of wearable sensors are being used for ambulatory monitoring of gait— the movement pattern of the limbs during locomotion over a solid surface. Sensors such as ECG, EMG, thermometers, gyroscopes, accelerometers, magnetometers and vibration sensors are built into wearable devices used in clinical environments for patient gait monitoring and rehabilitation.
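A classic example of fusing two of the sensors just mentioned is the complementary filter: the gyroscope tracks angular rate smoothly but drifts over time, while the accelerometer gives a drift-free but noisy tilt angle from gravity. Blending the two gives a stable orientation estimate of the kind gait-monitoring wearables rely on. The sample values and 100 Hz rate below are illustrative, not from any specific device.

```python
import math

def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step of a complementary filter.

    Integrates the gyroscope rate (smooth but drifting) and pulls
    the result gently toward the accelerometer angle (noisy but
    drift-free), weighted by alpha.
    """
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulated stream of (gyro deg/s, accel x m/s^2, accel z m/s^2) at 100 Hz
samples = [(0.5, 0.9, 9.76), (0.4, 1.0, 9.75), (0.6, 0.8, 9.77)]
angle = 0.0
for gyro, ax, az in samples:
    accel_angle = math.degrees(math.atan2(ax, az))  # tilt from gravity
    angle = complementary_filter(angle, gyro, accel_angle, dt=0.01)
print(round(angle, 2))
```

Neither sensor alone would work here: integrating only the gyroscope accumulates drift, and trusting only the accelerometer makes every footstep's vibration look like a change in posture.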
Moreover, new classes of wearable devices will deliver these functions and more for everyday use. Advances in MEMS miniaturization, compact and energy-efficient MCUs, and low-power connectivity have made possible a new category of wearable consumer medical devices that monitor personal health patterns instead of waiting for chronic diseases to show severe symptoms.
The potential of wearables has also given rise to a trend called self-quantification, where individuals use technology to collect data on aspects of their everyday lives. The inputs being quantified range from mental and physical activities to intakes such as food or air quality, along with variables such as blood sugar and oxygen levels, blood pressure, heart rate, body temperature, appetite, and mood, recorded before, during, and after different stimuli. The result is an automated self-monitor that combines wearable sensors and wearable computing to fuse this variety of data into an assessment far richer than the individual readings.
In telehealth and telemedicine, continuous monitoring of patient-generated health data and fusion of the different variables can lead to quicker and better management of chronic diseases, compared to a doctor’s visit every three to six months.
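One way such fusion of variables can work in practice is a simple combined risk score: several vitals that look borderline individually can, taken together, warrant attention. The sketch below is a toy illustration of that idea; the thresholds and weights are made up for the example and are not clinical guidance.

```python
def triage(vitals):
    """Toy risk score fusing several vitals into one alert level.

    Each out-of-range vital adds to the score; an alert fires only
    when the combined evidence crosses a threshold, which a single
    reading in isolation might not.
    """
    score = 0
    if vitals["heart_rate"] > 100 or vitals["heart_rate"] < 50:
        score += 1  # mild tachycardia or bradycardia
    if vitals["spo2"] < 94:
        score += 2  # low oxygen saturation weighs more heavily
    if vitals["systolic_bp"] > 160:
        score += 1  # elevated blood pressure
    return "alert" if score >= 2 else "ok"

print(triage({"heart_rate": 104, "spo2": 92, "systolic_bp": 150}))  # → alert
print(triage({"heart_rate": 70, "spo2": 98, "systolic_bp": 120}))   # → ok
```

The point is that the fused view flags a deteriorating trend that any one of the readings, viewed alone, might not.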
Patients are now empowered to track their vital signs at home or anywhere else. With the power of sensor fusion technology, patients can see more reliable data— the whole picture of their health condition— in real time. Because patients are equipped with medical devices, RPM solutions with continuous real-time data streaming help them gain knowledge of their condition(s) as they monitor the data, resulting in better engagement and optimized health. When patients are actively involved and engaged with their care management, they are more likely to adhere to medical interventions.
DrKumo provides sensor fusion-equipped medical devices and cloud-based remote monitoring services that enable healthcare providers and patients to obtain holistic and continuous biometric readings, enhancing their efficiency and optimizing patient lives.