Today’s Advanced Driver Assistance Systems (ADAS) are gradually evolving into full autonomous vehicle systems.

The hype and overblown expectations continue, and the auto companies, along with the many OEM and technology suppliers, are collectively spending billions of dollars to create the first practical self-driving cars. Good progress is being made. Analyzing this effort to create the first automated vehicles (AVs) makes clear what must happen: the challenges are mostly technical, but there are also major legal, social, and human issues to resolve.

ADAS as the Prelude to AV

Most of the essential technologies for self-driving cars are already at the core of the advanced driver assistance systems (ADAS) now widely available in most vehicles. Typical features include backup cameras, lane departure warning, blind spot detection, adaptive cruise control, automatic braking, and a few others. Some ADAS features are optional now, but the trend is to make them standard in all vehicles. ADAS will eventually morph into true AVs. Table 1 shows the different levels of automated vehicles as defined by the Society of Automotive Engineers (SAE).

LEVEL     AMOUNT OF AUTOMATION
Level 0   No automation. The driver performs all driving functions.
Level 1   Driver assistance. The driver performs all functions, but ADAS provides alerts and partial control of braking, steering, and throttle.
Level 2   Partial automation. The driver must still monitor the vehicle's actions, but automated systems control braking, steering, and throttle.
Level 3   Conditional automation. Automated driving systems perform all driving activities, but the driver must remain available to take control in special circumstances.
Level 4   High automation. Automated driving systems perform all driving activities. The driver may still control the vehicle if needed.
Level 5   Full automation. No driver is needed, but a driver may intervene if necessary.

This evolution from ADAS to AVs will occur as these seven critical issues are adequately addressed.

  1. Sensors

A self-driving car is a robot that must have sensing ability equivalent to a human's, or close to it, to function safely. Most of that is “seeing.” Developers are using multiple sensors to create an equivalent composite view suitable for driving. These include:

  • Video cameras. Highly developed, small, and cheap, and they see color. Multiple RGB cameras are used for different functions. The big problem is that they do not see well at night, and weather like fog, snow, and hard rain can interfere with their view. Short range is another limitation, and distances to objects are difficult to determine. Furthermore, the complex pictures they capture require lots of memory and special software such as machine vision and artificial intelligence to interpret what they see. Newer vision systems using two cameras in a stereo configuration overcome some of these obstacles by recovering depth from the disparity between the two views (see the stereo-depth sketch after this list).
  • IR cameras. Infrared sensors have been available for decades. They are widely used in weapons and satellites but not, so far, in cars. They “see” heat signatures, giving the cameras all-important night vision. They bring a different and complementary view that adds to the quality of coverage. Standard RGB cameras are also being combined with IR cameras in a collaborative way to make object detection easier and more accurate.
  • Radar. Single-chip radars are now available that give an alternate view of the driving surroundings. Using reflected radio waves, they detect objects at a significant distance. The new 77-GHz CMOS devices are cheaper and consume less power. Their field of view and beamwidth can be set from a few degrees up to about 75 deg. by antenna design and selection, and they can see out to distances from about a meter up to 300 meters (see the FMCW range sketch after this list). Multiple devices can be used to fill in any gaps that remain in the 360-deg. view goal.
  • Ultrasonic. These short-range sensors give an entirely different view. They are typically built into side mirrors for close-object detection; parking assist is another use. Operating at 58 kHz, these sensors are inexpensive and easily incorporated into the vehicle. Range follows from simple echo timing (see the time-of-flight sketch after this list).
  • Lidar. Light detection and ranging sensors are laser-based and radar-like. They paint the surrounding area with narrow laser beams and detect the reflections to generate a picture of the environment (see the point-cloud sketch after this list). Their advantage is the ability to create a precise 360-deg. 3D image that is superior for object detection. They are not used in current ADAS because of their very high cost, and there is some doubt about their use in the forthcoming self-drivers, although recent developments show considerable progress.
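
The stereo-camera approach mentioned above rests on a simple geometric relationship: a point's distance Z equals the focal length f times the camera baseline B divided by the pixel disparity d between the two views. The sketch below illustrates the arithmetic in Python; the focal length, baseline, and disparity values are illustrative assumptions, not figures from any particular system.

```python
# Depth from stereo disparity: a minimal sketch of the principle behind
# two-camera stereo vision. All numeric values are illustrative assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return distance (m) to a point seen by two horizontally offset cameras.

    Z = f * B / d, where f is the focal length in pixels, B is the camera
    baseline in meters, and d is the pixel disparity between the two views.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; zero implies infinite range")
    return focal_px * baseline_m / disparity_px

if __name__ == "__main__":
    # Example: 1000-px focal length, 30-cm baseline, 12-px disparity
    print(f"{depth_from_disparity(1000.0, 0.30, 12.0):.1f} m")  # -> 25.0 m
```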
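
Automotive 77-GHz radars commonly use frequency-modulated continuous-wave (FMCW) chirps, where target range is proportional to the beat frequency between the transmitted and received signals. The following sketch assumes an FMCW scheme with illustrative chirp parameters; it does not describe any specific radar chip.

```python
# FMCW radar range estimation sketch. The chirp bandwidth and duration below
# are illustrative assumptions chosen to show the 300-m range the article cites.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz: float, chirp_bandwidth_hz: float, chirp_time_s: float) -> float:
    """Range R = c * f_beat * T_chirp / (2 * B_chirp) for a linear FMCW chirp."""
    return C * beat_hz * chirp_time_s / (2.0 * chirp_bandwidth_hz)

if __name__ == "__main__":
    # A 1-GHz chirp swept in 50 us: a 40-MHz beat tone corresponds to ~300 m.
    print(f"{fmcw_range(40e6, 1e9, 50e-6):.1f} m")  # -> ~299.8 m
```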
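
Ultrasonic ranging is straightforward time-of-flight arithmetic: the sensor emits a burst, and the distance is the round-trip echo time multiplied by the speed of sound, halved. A minimal sketch, assuming the speed of sound in air at roughly 20 deg. C:

```python
# Ultrasonic time-of-flight ranging sketch. The speed-of-sound constant is an
# assumption for dry air at about 20 C; real sensors compensate for temperature.

SPEED_OF_SOUND_M_S = 343.0

def ultrasonic_distance(echo_round_trip_s: float) -> float:
    """Distance = speed of sound * round-trip time / 2 (pulse travels out and back)."""
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0

if __name__ == "__main__":
    # A 5.8-ms echo corresponds to roughly 1 m -- typical parking-assist range.
    print(f"{ultrasonic_distance(0.0058):.2f} m")  # -> 0.99 m
```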
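
Each lidar return is a range measurement along a known azimuth and elevation, and the 3D point cloud is built by converting those polar returns to Cartesian coordinates. The sketch below shows that conversion for a single sweep; the angles, range, and axis convention (x forward, y left, z up) are illustrative assumptions, not any vendor's standard.

```python
# Lidar point-cloud sketch: converting polar laser returns (range, azimuth,
# elevation) into Cartesian (x, y, z) points. Plain spherical-to-Cartesian math.

import math

def lidar_return_to_xyz(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one lidar return to (x, y, z) in meters."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

if __name__ == "__main__":
    # One full 360-deg sweep at a single elevation, 1-deg steps, 20-m range
    cloud = [lidar_return_to_xyz(20.0, az, -2.0) for az in range(360)]
    print(len(cloud), "points; first:", tuple(round(v, 2) for v in cloud[0]))
```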

Read more: http://innovation-destination.com/2018/02/16/7-factors-critical-success-self-driving-cars/