We all know that 1 + 1 = 2. But is it ever possible for 1 + 1 to equal 3? Or 5? Or 50?

At its core, this is what sensor fusion is all about: creating greater total value by using data from multiple sensors in conjunction with one another. Used this way, sensor fusion creates a richer, more accurate, and more useful picture of reality than using the same sensors' data in isolation.

What does this actually look like in practice? Let’s take a widely used application of Ainstein’s advanced radar technology: autopilot and collision avoidance for UAVs (drones).

For drones to have any practical use, their human pilots need to know where the drone is at all times (including its elevation above the ground). It’s also often useful, or even critical, to know the drone’s current velocity. To provide this information, most drone platforms include an IMU (inertial measurement unit) and a GPS sensor. Add Ainstein’s radar altimeters, and your drones gain a highly reliable, stable above-ground altitude reading; together with Ainstein’s collision avoidance radar sensors, these sensors generate full 3D position information for your drones. Any one of these sensors used alone provides a less-than-complete picture of where the drone is at any given moment. When data from the separate sensors is analyzed together, however, it provides a strikingly accurate and precise picture of where the drone is and how fast it’s moving. 1 + 1 = 50!
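One simple way to see how fused data beats any single sensor is inverse-variance weighting, a basic building block of Kalman-style fusion. The sketch below is illustrative Python, not Ainstein’s actual implementation, and the sensor values and noise variances are made-up numbers:

```python
def fuse(measurements):
    """Fuse independent noisy readings of the same quantity.

    measurements: list of (value, variance) pairs, one per sensor.
    Returns (fused_value, fused_variance). Each reading is weighted
    by the inverse of its variance, so more trustworthy sensors
    count for more.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(v * w for (v, _), w in zip(measurements, weights)) / total
    # The fused variance is smaller than ANY individual variance:
    # the combined estimate beats every sensor on its own.
    return value, 1.0 / total

# Hypothetical readings: GPS altitude 100.4 m (variance 4.0, noisy)
# and radar altimeter 99.8 m (variance 0.04, precise near the ground).
alt, var = fuse([(100.4, 4.0), (99.8, 0.04)])
```

The fused variance is always lower than the smallest individual variance, which is the mathematical sense in which the whole can be worth more than the sum of its parts.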

We can take the above example a bit deeper and look at what happens when an additional sensor — say, a hyperspectral camera used for crop monitoring — is added to the drone. When used for precision agriculture, drones fly over fields of crops and take readings of the crops’ status. Oftentimes this is done with a hyperspectral camera, which, by examining different wavelengths of light, allows the drone to see conditions in the crops that aren’t visible to the human eye. In this way, the drone can detect possible defects or anomalies in the crops, such as current or future crop disease outbreaks, overwatering or underwatering, and lack of fertilizer. However, these readings are only useful if they are paired with data about when and where they were taken — this is sensor fusion.
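That pairing can be as simple as stamping each camera reading with the drone’s position and the capture time as it is recorded. A minimal sketch, assuming hypothetical field names and data shapes (illustrative Python, not a real flight-stack API):

```python
import time
from dataclasses import dataclass

@dataclass
class TaggedReading:
    reflectance: dict   # wavelength (nm) -> measured reflectance
    lat: float          # GPS latitude of the capture point
    lon: float          # GPS longitude of the capture point
    agl_m: float        # altitude above ground from the radar altimeter
    timestamp: float    # capture time (Unix seconds)

def tag_reading(reflectance, gps_fix, agl_m, t=None):
    """Fuse one hyperspectral frame with position and time so the
    anomaly it shows can be mapped back to one spot in the field."""
    lat, lon = gps_fix
    return TaggedReading(reflectance, lat, lon, agl_m, t if t is not None else time.time())

# Example: a single band reading tagged at a (made-up) fix.
reading = tag_reading({550: 0.42}, (38.9717, -95.2353), 30.0, t=1_700_000_000.0)
```

With every frame tagged this way, a post-flight tool can turn a pile of spectra into a map of exactly which rows need water or fertilizer.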

Have you ever wished your snow blower could clear your driveway by itself? With camera and radar sensor fusion, it can! The inputs from the camera and radar cross-check each other for greater accuracy, so the machine recognizes when your neighbor is walking by and stops to let her pass.

Ainstein’s world-class team of engineers has expertise in developing products that allow for deep sensor fusion, with over 100 I/Os (inputs/outputs). Just think of the possibilities with that many I/Os! Interested in seeing how Ainstein’s technology can be used for your own sensor fusion project? Need help with a custom sensor fusion development project? Drop us a line at hi@ainstein.ai, and we’ll be in touch.