Sensor Fusion and Data Fusion for ADAS and Driverless Vehicles
The French High-Tech startup NEXYAD recently presented their module SafetyNex.
SafetyNex is a real-time driving risk assessment system. Of course, Driving Risk makes everyone think of car insurance and fleet management, and that is a natural application (deployment has already started). But it is important to note that Driving Risk is also a key notion for ADAS and driverless cars.
Indeed, Driving Risk arises when there is no adequation (no match) between Driving Behaviour and Driving Context. ADAS and driverless cars act on Driving Behaviour:
. ADAS modifies Driving Behaviour: braking when the human driver does not, etc.
. The driverless car creates Driving Behaviour: there is still a driver, called "artificial intelligence".
Driving Context is measured through:
. Map Electronic Horizon
. Time to collision (front and rear)
. Number of vulnerable road users around (even on sidewalks)
. Atmospheric visibility / weather conditions (fog, pouring rain, ...)
. X2Car data streams (accident, weather alert, construction area, ...)
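Aggregating these heterogeneous inputs can be sketched as a single context record. The field names, types, and units below are assumptions chosen for illustration, not SafetyNex's actual interface:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DrivingContext:
    """One snapshot of the heterogeneous context inputs listed above (hypothetical schema)."""
    horizon_curvature: float   # map electronic horizon: curvature of the road ahead (1/m)
    ttc_front_s: float         # time to collision, front (seconds)
    ttc_rear_s: float          # time to collision, rear (seconds)
    vulnerable_count: int      # pedestrians/cyclists nearby, including on sidewalks
    visibility_m: float        # atmospheric visibility (metres)
    x2car_alerts: List[str] = field(default_factory=list)  # e.g. "construction_area"

# Example snapshot: gentle curve, clear weather, one pedestrian, no alerts
ctx = DrivingContext(0.002, 3.5, 8.0, 1, 900.0)
print(ctx.vulnerable_count)
```

A single typed record like this lets the fusion stage treat map data, perception outputs, and connected-car alerts uniformly, whatever sensor produced them.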
So you can now imagine that if you can ESTIMATE the adequation between Driving Behaviour and Driving Context, then you can build much more relevant ADAS and Driverless Artificial Intelligence (detecting adequation or inadequation).
You may notice that Driving Context is measured through heterogeneous sensors and data streams. This brings no difficulty to SafetyNex, which uses Fuzzy Sets and Possibility Theory to estimate adequation, giving an output called Driving Risk (which you will want to minimize, under constraints of mobility efficiency).
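As a minimal illustration of the fuzzy-set idea (the membership functions, thresholds, and rules below are invented for the example, not SafetyNex's actual model), risk can be expressed as the possibility that at least one "inadequation" rule fires, with min as fuzzy conjunction and max as aggregation:

```python
def ramp_up(x, a, b):
    """Membership rising linearly from 0 at a to 1 at b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def ramp_down(x, a, b):
    """Membership falling linearly from 1 at a to 0 at b."""
    return 1.0 - ramp_up(x, a, b)

def driving_risk(speed_kmh, ttc_front_s, vulnerable_count):
    # Fuzzy descriptions of behaviour and context (hypothetical thresholds).
    fast = ramp_up(speed_kmh, 30.0, 90.0)          # behaviour: driving fast
    short_ttc = ramp_down(ttc_front_s, 1.5, 4.0)   # context: closing on a vehicle ahead
    crowded = ramp_up(vulnerable_count, 0.0, 5.0)  # context: vulnerable road users around

    # Inadequation rules: fast while TTC is short, or fast in a crowded area.
    # min = fuzzy AND; max = possibility that at least one rule fires.
    return max(min(fast, short_ttc), min(fast, crowded))

print(driving_risk(20.0, 10.0, 0))   # calm situation: low speed, ample TTC -> 0.0
print(driving_risk(120.0, 1.0, 6))   # fast, short TTC, crowded -> 1.0
```

The same pattern extends to as many rules and inputs as needed: each heterogeneous measurement only has to be mapped onto a [0, 1] membership degree, after which the combination is uniform.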
SafetyNex is therefore actually a sensor and data fusion system, and a particularly effective one, because it generates a variable (Driving Risk) that is a KEY NOTION for driving and is EASY TO UNDERSTAND AND USE.
Read more: http://www.safetynex.net