News

Updates and analysis on technology, impact investing, and our venture companies.


Everywhere we look, sensors surround us, collecting important data and interfacing our digital world with our everyday lives. There is a whole ecosystem of sensor technology already in place, but are we using that technology to its fullest potential?

The sensor market was valued at $123.5 billion in 2016 and is expected to reach nearly $240.3 billion by 2022. As technology advances and the number of sensors grows, the data they produce is becoming increasingly valuable for understanding how our world works.

Sensors are usually designed for a single purpose: to gather and stream one type of data related to a single function. But overlapping these data streams could lead to an entirely new level of complex understanding. In the same way that our brains fuse all of our senses to create a more complete perceptual realization of the world around us, combining multiple sensor technologies could enable much deeper, more useful insight.

To achieve that level of insight, we need tools to piece together all of the data – we need artificial intelligence. With the use of AI, we can weave together data streams from multiple inputs to create a comprehensive view. This combination of sensor data is called sensor fusion.
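In its simplest static form, fusing sensor data means combining overlapping measurements so the result is more reliable than any single reading. A minimal sketch, using inverse-variance weighting (a standard statistical technique, not the Connexion implementation): each sensor's reading is weighted by how trustworthy it is, and the fused estimate has lower variance than either input.

```python
# Illustrative only: fuse two noisy estimates of the same quantity
# (e.g., heart rate from a camera and from a wearable) by weighting
# each reading inversely to its variance.

def fuse(estimates):
    """estimates: list of (value, variance) pairs -> (fused value, fused variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# A precise wearable reading (72 bpm, variance 1.0) dominates a noisy
# camera reading (78 bpm, variance 9.0).
value, variance = fuse([(72.0, 1.0), (78.0, 9.0)])
```

The fused variance (0.9) is smaller than either sensor's alone, which is the basic payoff of fusion: combined streams beat any single stream.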

Sensor fusion is already powering some of our most revolutionary technologies. Smartphones combine gyroscopes, accelerometers, and compasses for all kinds of useful applications, and self-driving cars will rely on sensor fusion to navigate busy, unpredictable streets.
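The smartphone case has a classic textbook form: a complementary filter that blends a gyroscope (accurate over short intervals but drifting over time) with an accelerometer (noisy but drift-free). The sketch below shows one such filter for a single tilt angle; it is a generic illustration of the technique, not code from any particular phone.

```python
import math

def complementary_filter(angle, gyro_rate, accel, dt, alpha=0.98):
    """One update step fusing a gyroscope rate (deg/s) with an
    accelerometer tilt estimate. accel is (ax, az) in g's."""
    accel_angle = math.degrees(math.atan2(accel[0], accel[1]))
    # Trust the integrated gyro for fast changes, the accelerometer
    # for the long-term reference (it drifts far less).
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Device held steady at 10 degrees of tilt: the gyro reports no
# rotation, but the accelerometer pulls the estimate toward 10.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel=(math.sin(math.radians(10)),
                                        math.cos(math.radians(10))),
                                 dt=0.01)
```

Neither sensor alone could hold a stable, drift-free orientation; together they can, which is why this pattern underpins everything from screen rotation to drone stabilization.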

But what if we could use sensor fusion to accomplish something bigger? What if we could use it to change healthcare and live healthier lives?

In the healthcare industry, sensors play an important role in monitoring health data in real time, and we're already seeing simple forms of sensor fusion. But integrating an advanced form of sensor fusion that combines a larger network of sensors could completely transform healthcare as we know it.

That’s the vision we’re working on here at Aspire Ventures. We’re developing a healthcare platform, Connexion, that fuses a large array of auditory, optical, and pressure sensors in an AI-powered healthcare kiosk that rapidly produces deep insights into the body. Users will be able to access a number of health applications within the kiosk—from automated musculoskeletal assessments to skin cancer screenings—independently, between their regular doctor visits.

Multiple high-speed cameras and a Kinect sensor work together to analyze posture, lateral balance, and body movement for subtle musculoskeletal irregularities; the camera feeds also pass through advanced algorithms that enable facial recognition, mood analysis, and even remote heart rate detection; and a pressure-sensitive mat measures body weight, the arch of your feet, and lateral balance.

That’s what we’ve developed so far, but there are vast possibilities for powerful new healthcare applications. By leveraging our adaptive artificial intelligence platform, A2I, Connexion cameras could remotely measure respiration rate and pulse oxygenation; a microphone could take the place of a stethoscope, using data from breathing patterns to infer internal issues such as valve dysfunction; and users could sync their wearable data with the Connexion for a deeper health analysis that shows the body’s changes over time.

We recently debuted our first Connexion prototype to professional trainers and doctors during the NBA Combine in Chicago in partnership with Fusionetics, a leading performance health technology company that works with professional sports teams across the country. In Chicago we demonstrated the first functional app for Connexion: an automated movement efficiency assessment that analyzes the user’s movements during a set of exercises and recommends personalized training programs for remediation, injury prevention, and improved body control.

“The Kinect data, video, and pressure mat data are 'fused' in a compensation detection algorithm to determine issues that could lead to poor performance or injury, like arm drop, excessive back arch, or heel lift,” says Peter Funke, Scientist at Aspire Ventures. That’s one piece of the performance health puzzle. We’re also working with Fusionetics on a number of other applications that will let athletes test strength, flexibility, and balance.
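The compensation-detection idea Funke describes can be pictured as a set of checks that each draw on a different sensor stream. The sketch below is a hypothetical illustration: the field names, thresholds, and rules are invented for this example and are not Aspire's actual algorithm, which fuses the streams with adaptive AI rather than fixed thresholds.

```python
# Hypothetical illustration only: field names and thresholds are
# invented, not taken from the Connexion compensation algorithm.

def detect_compensations(kinect, mat, video):
    """Flag movement compensations; each argument is a dict of
    per-exercise measurements from one sensor stream."""
    flags = []
    if kinect["back_arch_deg"] > 15:          # skeletal joint angles
        flags.append("excessive back arch")
    if mat["heel_pressure_ratio"] < 0.2:      # weight shifted off heels
        flags.append("heel lift")
    if video["shoulder_drop_cm"] > 4:         # frame-by-frame tracking
        flags.append("arm drop")
    return flags

flags = detect_compensations(
    kinect={"back_arch_deg": 18},
    mat={"heel_pressure_ratio": 0.1},
    video={"shoulder_drop_cm": 2},
)
# flags -> ["excessive back arch", "heel lift"]
```

The point of the sketch is that each compensation pattern is only visible to certain sensors—heel lift shows up in the pressure mat, back arch in the skeletal data—so a complete assessment requires all three streams.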

The Chicago unveiling has given us a promising outlook for the future of the Connexion and its impact on sports science, but it will have much broader applications in healthcare. By fusing all the sensory technology available to us, we can change the way we assess and manage our health—letting patients do everything from automated skin cancer detection to guided physical therapy sessions. It will also change the way we interact with healthcare systems. The Connexion experience gives patients a convenient point of access to care, increasing their engagement and allowing them to take more control of their health and wellness. It also gives doctors far deeper insight into their patients, aiding diagnosis and treatment.

Using sensor technology to track our health is nothing new. But we can get so much more value from the technology that we already have. With AI-powered sensor fusion, we can use existing technology to change the way we see and understand our bodies to live healthier, happier lives.