Advanced Driver Assistance Systems Enable Autonomous Vehicles
ADAS is already one of the fastest-growing segments in automotive electronics because it serves as the bridge from non-autonomous to fully autonomous vehicles. The closer we get to Level 5, fully autonomous driving, the more robust and comprehensive ADAS will become.
Components Powering ADAS are Advancing
ADAS is commonly understood to include five key components:
- Sensors: Already common in today's connected devices, many sensors have become significantly cheaper thanks to innovative manufacturing techniques and growing usage volumes. When connected to networks or the cloud, sensors yield real-time, actionable data that can power ADAS. However, most of these devices have limited range and bandwidth; their output must be improved and fused with other sensor data to enable a Level 5 autonomous driving experience.
- Processors: With the increasing processing-speed requirements of ADAS applications, processors are used for everything from building a real-time 3D spatial model of a car's surroundings to calculating proximity and threat levels based on the environment. However, due to the length of qualification processes in the automotive industry, the adoption of advanced manufacturing technologies lags roughly six years behind the average smartphone processor.
- Software: Automotive companies increasingly rely on software; after all, software is what makes the hardware work. The hardware can be significantly simplified by software, especially when machine learning and artificial intelligence are used to handle different driving situations. The development effort behind this software is one of the industry's largest undertakings.
- Mapping: The ADAS mapping function stores and updates geographical and infrastructure information gathered by vehicle sensors to determine the vehicle's exact location. It maintains this information and communicates it to system control even if GPS coverage fails. Because automotive OEMs and other players seek lower-cost options, third-party mapping applications generally meet this demand.
- Actuators: The electrification of the actuator system has been a major enabler of ADAS because it has facilitated interaction with other electrical components in the vehicle. With processors analyzing data from vehicle sensors, the ADAS system can make decisions that actuators execute. This enables everything from electric power steering to autonomous acceleration and braking.
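The sense, process, actuate loop described by these components can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the class, function names, and time-to-collision thresholds are all hypothetical assumptions chosen for clarity.

```python
# Hypothetical sketch of the ADAS sense -> process -> actuate loop.
from dataclasses import dataclass


@dataclass
class SensorReading:
    """Fused estimate from radar/camera/lidar (sensor stage)."""
    obstacle_distance_m: float
    closing_speed_mps: float  # positive when closing on the obstacle


def assess_threat(reading: SensorReading) -> str:
    """Processor stage: turn fused sensor data into a threat level
    using an assumed time-to-collision heuristic."""
    if reading.closing_speed_mps <= 0:
        return "none"
    time_to_collision = reading.obstacle_distance_m / reading.closing_speed_mps
    if time_to_collision < 1.5:
        return "critical"
    if time_to_collision < 3.0:
        return "warning"
    return "none"


def actuate(threat: str) -> str:
    """Actuator stage: map the decision to a braking command."""
    commands = {
        "critical": "full_brake",
        "warning": "pre_charge_brakes",
        "none": "no_action",
    }
    return commands[threat]
```

For example, an obstacle 10 m ahead with a 10 m/s closing speed gives a 1-second time to collision, which this sketch maps to a full-brake command.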
A Compelling Use Case: Driver Monitoring
Driver and passenger safety always comes first in the automotive sector. The major components discussed above have enabled new vehicle interface features for ADAS. One example is the next-generation in-car sensing work on driver monitoring that eyeSight Technologies is doing in partnership with Jabil.
Driver distraction and drowsiness are two of the major contributors to accidents and fatalities. Even with all the enhanced features and entertainment content entering the vehicle, the driver must still focus on driving. In-car sensing technologies help ensure that by monitoring whether the driver is distracted or drowsy.
The technology uses machine-learning computer vision software to monitor several parameters of the driver's head and face, such as:
- Head pose
- Eyelid position and movements
- Iris and gaze direction
The system can also recognize drivers' faces and estimate their gender and age.
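A distraction check based on the gaze parameters above could be sketched as follows. The thresholds, function names, and the idea of an "on-road cone" are illustrative assumptions, not eyeSight's actual parameters.

```python
# Hypothetical sketch: flagging distraction from gaze direction.

def is_eyes_off_road(gaze_yaw_deg: float, gaze_pitch_deg: float,
                     yaw_limit: float = 20.0, pitch_limit: float = 15.0) -> bool:
    """True when the gaze falls outside an assumed on-road cone."""
    return abs(gaze_yaw_deg) > yaw_limit or abs(gaze_pitch_deg) > pitch_limit


def distraction_alert(off_road_frames: list, fps: int = 30,
                      max_off_road_s: float = 2.0) -> bool:
    """Alert when the most recent unbroken run of eyes-off-road
    frames lasts longer than the allowed duration."""
    run = 0
    for off in reversed(off_road_frames):
        if not off:
            break
        run += 1
    return run / fps > max_off_road_s
```

At 30 frames per second, roughly 70 consecutive off-road frames exceed the assumed two-second limit and would trigger an alert.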
In one use case, if the driver shows a high PERCLOS (the percentage of time the eyelids are closed), the system can take other parameters into account to determine the driver's state of drowsiness. Car manufacturers then have an opportunity to further personalize the response system, which may include vibrating the seat or sounding an audible alarm to keep the driver alert and reduce the risk of an accident.
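The PERCLOS computation itself is simple to illustrate. This is a minimal sketch under stated assumptions: eyelid closure is given as a per-frame ratio, "closed" means at least 80% shut, and the drowsiness threshold and supporting cues (head nod, low blink rate) are hypothetical placeholders for the "other parameters" the article mentions.

```python
# Hypothetical sketch: PERCLOS-based drowsiness estimation.

def perclos(eye_closure_samples, closed_threshold=0.8):
    """Fraction of frames in which the eyelids are at least
    `closed_threshold` closed (0 = fully open, 1 = fully shut)."""
    if not eye_closure_samples:
        return 0.0
    closed = sum(1 for c in eye_closure_samples if c >= closed_threshold)
    return closed / len(eye_closure_samples)


def drowsiness_state(perclos_value, head_nod=False, blink_rate_low=False):
    """Combine PERCLOS with other (assumed) cues, as the article
    describes, before triggering a response such as a seat vibration."""
    if perclos_value > 0.15 and (head_nod or blink_rate_low):
        return "drowsy"
    if perclos_value > 0.15:
        return "possibly drowsy"
    return "alert"
```

For instance, 20 closed-eye frames in a 100-frame window give a PERCLOS of 0.2; combined with a detected head nod, this sketch would classify the driver as drowsy.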