Key Trend: Sensor Fusion to Create Sensors as a Platform for Delivering Solutions

Sensor fusion is the process of combining data from different sensors. A microcontroller (the brain) then uses these data, together with other sources of contextual awareness, to predict outcomes more accurately. Multiple data sources help remove errors by cross-checking readings against one another, and adding contextual data makes the result far more useful than data from a single sensor source. Sensor fusion creates a situation in which the whole is much greater than the sum of its parts.
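As a minimal sketch of that cross-checking idea, the snippet below fuses two noisy readings of the same quantity by inverse-variance weighting, a basic building block of sensor fusion. The sensor values and noise figures are made up for illustration:

```python
# Hypothetical example: two sensors measure the same quantity with different
# noise levels (variances). Inverse-variance weighting trusts the less noisy
# sensor more, and the fused estimate has lower variance than either input.

def fuse(reading_a, var_a, reading_b, var_b):
    """Combine two noisy estimates of one quantity into a better one."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * reading_a + w_b * reading_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # always smaller than var_a and var_b
    return fused, fused_var

# Example: a temperature read by two sensors with different noise levels.
value, var = fuse(21.0, 0.5, 22.0, 1.0)
print(round(value, 3), round(var, 3))  # → 21.333 0.333
```

The fused variance (0.333) is lower than either sensor's own variance (0.5 and 1.0), which is the quantitative sense in which "the whole is greater than the sum of its parts".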

Sensor fusion platforms are gaining popularity and being adopted by different OEMs. Historically, sensors were often designed for a single purpose: to gather and stream data related to a single function. With multiple onboard sensors, however, designers can combine outputs to predict a situation more accurately, for example, combining data from the accelerometer, compass, gyroscope and pressure sensor to determine the device’s position in three dimensions.

3D motion sensing using Sensor Fusion

Source: Kionix
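The pressure sensor's contribution to the third dimension can be sketched with the standard barometric (ISA) approximation, which converts air pressure into an altitude estimate. This is a generic formula, not any particular vendor's implementation, and the sea-level reference pressure below is the textbook default:

```python
# Illustrative only: approximate altitude from barometric pressure using the
# International Standard Atmosphere formula. Real devices calibrate the
# sea-level reference pressure against local weather conditions.

def pressure_to_altitude(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude in metres from barometric pressure in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** 0.1903)

print(round(pressure_to_altitude(1013.25), 1))  # → 0.0 at sea-level pressure
print(round(pressure_to_altitude(899.0), 1))    # roughly one kilometre up
```

Fusing this altitude estimate with accelerometer and gyroscope data is what lets a device place itself in three dimensions rather than two.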

AR/VR Headsets:

Tracking head position with little error has been a challenge for many AR/VR developers. Measurement errors can cause incorrect orientation and a drift effect in the virtual world. These errors make AR/VR headsets unsuitable for longer periods of use and can lead to motion sickness for users.

Therefore, OEMs use multiple sensors, such as the gyroscope and accelerometer, and combine their data to more accurately predict orientation in three dimensions.

Source: Oculus
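A complementary filter is one common way to combine these two sensors; it is a simplified sketch of the idea, not what any specific headset actually ships. The gyroscope is accurate over short intervals but drifts, while the accelerometer is noisy but gives a drift-free gravity reference, so each update blends the two. The rates, angles and blend factor below are illustrative:

```python
# Hedged sketch of a complementary filter for one axis (pitch, in degrees).
# alpha close to 1 trusts the fast-but-drifting gyro; the remaining weight
# lets the slow-but-absolute accelerometer angle correct the drift.

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend integrated gyro rate with the accelerometer's absolute angle."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Simulated stream: the gyro has a constant bias (which alone would drift
# forever), while the accelerometer reports the true pitch of 10 degrees.
pitch = 0.0
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.5, accel_pitch=10.0, dt=0.01)
print(round(pitch, 2))  # settles near 10 despite the biased gyro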

Autonomous Car:

Autonomous cars use multiple cameras for visual input and depth sensors for distance. To remain accurate in low light and other dynamic conditions, designers combine those sensors’ data with computational sensing algorithms for a better understanding of the physical world around the vehicle.

Source: Here

Amazon introduced its unconventional ‘Amazon Go’ retail store in December 2016; it is currently in beta testing. Amazon created a system that automatically logs items in the customer’s cart when a product is picked up from a shelf. Payment is also processed automatically from the customer’s Amazon account on exiting the store, without going through a traditional checkout counter.

This system uses multiple sensor technologies, such as scales, pressure sensors and load cells, in combination with image processing and machine learning techniques to identify different items and quantities. Tesco also started trialing a similar concept store in October 2017.

Source: YouTube-Amazon Go Store

Indoor Navigation:

Indoor navigation is another area where sensor fusion technology is in use. Combining data from sensors such as the pedometer, accelerometer, gyroscope and magnetometer can more accurately determine indoor location and help users navigate unfamiliar buildings such as large shopping malls.
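A common building block here is pedestrian dead reckoning: each step detected by the pedometer advances the position estimate along the heading reported by the magnetometer and gyroscope. The sketch below assumes a fixed step length and clean headings, both of which real systems must estimate and correct:

```python
import math

# Hypothetical pedestrian dead-reckoning step. Heading 0 degrees is taken as
# east and 90 as north; the 0.7 m step length is an illustrative constant.

def step_update(x, y, heading_deg, step_length=0.7):
    """Advance the (x, y) position by one step along the given heading."""
    rad = math.radians(heading_deg)
    return x + step_length * math.cos(rad), y + step_length * math.sin(rad)

# Walk 10 steps east, then 10 steps north.
x, y = 0.0, 0.0
for _ in range(10):
    x, y = step_update(x, y, heading_deg=0.0)   # east
for _ in range(10):
    x, y = step_update(x, y, heading_deg=90.0)  # north
print(round(x, 2), round(y, 2))  # → 7.0 7.0
```

Because each step's error compounds, practical systems fuse this estimate with Wi-Fi or beacon fixes, which is the same error-correction role sensor fusion plays elsewhere in this article.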

In an ideal IoT ecosystem, interactions between humans, machines and infrastructure need to be in sync with contextual awareness. With constant streams of data, the system also needs to be error-free. Combining multiple sets of data from different sensors makes it less error-prone and better able to predict situations.

For example, multiple sources of data from GPS, cameras, infrared, RFID, etc. could be used to accurately identify trucks and automate their fulfillment during a delivery process, saving time and cost.

Using sensor technology to track our health is nothing new, but sensor fusion platforms will deliver much more value. They will be powerful for medical applications, detecting muscle contraction, sweat, body temperature, heart-rate variation and so on with embedded digital sensors, then combining those readings with medical-history data to identify diseases accurately. This can help reduce the cost of medical services through early detection, by identifying abnormal patterns and predicting possible outcomes.

Source: Aspire Ventures
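As a toy illustration of "identifying abnormal patterns", the snippet below flags outliers in a vital-sign stream with a simple z-score test. The readings and threshold are invented, and real medical systems use far more rigorous models than this:

```python
# Illustrative only: flag readings that sit far from the stream's mean in
# units of standard deviation. The 2.5-sigma threshold is arbitrary.

def zscore_anomalies(readings, threshold=2.5):
    """Return the readings whose z-score exceeds the threshold."""
    mean = sum(readings) / len(readings)
    var = sum((r - mean) ** 2 for r in readings) / len(readings)
    std = var ** 0.5
    return [r for r in readings if std and abs(r - mean) / std > threshold]

heart_rates = [72, 75, 71, 74, 73, 70, 76, 72, 140, 74]  # one abnormal spike
print(zscore_anomalies(heart_rates))  # → [140]
```

In a fused system, a flagged spike would be checked against other sensors (motion, temperature) and the patient's history before being treated as a symptom rather than, say, exercise.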

Sensor Fusion Takeaways

  • With the reduction of sensor costs, more OEMs will adopt multiple sensors. Sensor fusion technology, along with improvements in machine learning algorithms, will make devices smarter and more contextually aware.
  • Advances in sensor fusion platforms will likely drive the emergence of a new software industry that uses sensors as a platform to deliver solutions specific to industry needs.
  • The bill-of-materials (BoM) cost of sensors will likely be much lower, and solution providers will head into business with multiple revenue models, including subscription models bundled with complete software solutions. We will likely see innovative new business offerings from solution providers.
  • Big players with both hardware and software expertise will succeed in this process. We will likely see many mergers and acquisitions as big players acquire small industry specialists to fill gaps in their portfolios. However, some small players with patented technology and specific industry expertise may survive this consolidation.


Sensors in Smartphones to Top 10 Billion Unit Shipments in 2020