Sensors for autonomous vehicles: competition or collaboration?

The development of autonomous vehicles has accelerated in recent years. Waymo’s fleet and Audi’s moves since the release of its A8 model show the great interest in this field. But the term ‘autonomous vehicle’ covers two categories. One is robotic cars, which embed many sensors such as cameras, LiDAR and radar for complete autonomy. The other is Advanced Driver Assistance System (ADAS) vehicles, which embed only a limited number of these sensors for partial autonomy. Against this backdrop, Yole Développement’s (Yole) LiDAR for Automotive and Industrial Applications 2019 report forecasts that the automotive LiDAR market will reach $4.2B by 2024.

Autonomous car sensing from present to future

Robotic vehicles use multiple LiDARs to map their surroundings. Adopting LiDARs in combination with cameras and radars appears necessary to achieve a high level of redundancy between sensors and ensure passenger safety.

The case of ADAS level 3 vehicles, which are able to monitor the driving environment, is still uncertain, as OEMs target partial automation. At this level of autonomy, OEMs will define the use cases in which cars can be driven autonomously. Typically, OEMs currently target two use cases: traffic jams and/or highway driving. The OEM will choose when the autonomous mode can be enabled and can decide to exclude night-time conditions, for example. These choices, which are mainly cost-driven, will directly determine the type and number of sensors embedded in ADAS vehicles. As a consequence, and because traditional sensor performance is still improving, a combination of cameras, radars and ultrasonic sensors can be cheaper and good enough for this level of autonomy. This solution can be used by generalist OEMs, as its cost should remain acceptable to their customers. High-end OEMs and luxury ADAS ranges, with customers willing to adopt the latest technologies, might want to differentiate and offer another solution in which LiDARs could be implemented. This could provide more advanced functionality or enable multiple autonomous driving scenarios.

Autonomous cars need information about their environment in order to navigate safely. Numerous sensors are available to provide this information, including cameras and Global Navigation Satellite System (GNSS) sensors. However, only three modalities provide direct distance measurements: ultrasonic sensing for short-distance ranging, radar for object detection, and LiDAR for 3D perception.

Today, these sensors are used in both passenger vehicles and robotic cars, but differently in each application. Passenger vehicles have limited autonomous capabilities, whereas robotic cars by definition must be fully autonomous and rely on sensor redundancy. Because air strongly attenuates sound waves, ultrasonic sensor ranges are limited to a few meters, and they are mainly used for parking assistance. Radars are commonly used in passenger vehicles for adaptive cruise control (ACC) and autonomous emergency braking (AEB). Accordingly, in Yole’s Radar and Wireless for Automotive Report 2019, we forecast that the automotive radar market will reach more than $8B in 2024. Robotic cars use either 4D imaging radars, whose large antenna arrays enable angular resolutions below 1°, or tens of tiny ultra-wideband high-resolution radars. However, current 4D imaging radars are larger than 10cm x 10cm, which makes them unpopular with passenger-car designers, while ultra-wideband radar range is limited to a few tens of meters. LiDARs, priced at several thousand dollars, are unattractive to car manufacturers. However, their high performance, with a 200-meter range and 0.1° angular resolution, has made them widely adopted by robotic car companies.
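The ranging trade-offs described above can be summarized in a minimal sketch. The LiDAR figures (200 m, 0.1°) and the qualitative limits for ultrasonic ("a few meters") and ultra-wideband radar ("a few tens of meters") come from this article; the exact numeric values chosen for those two, and the 200 m radar figure, are illustrative assumptions, not datasheet values:

```python
# Approximate ranging characteristics per modality.
# Numeric values are illustrative assumptions except where this article
# quotes them (LiDAR: 200 m range, 0.1 deg angular resolution).
SENSORS = {
    "ultrasonic": {"max_range_m": 5,   "typical_use": "parking assistance"},
    "uwb_radar":  {"max_range_m": 40,  "typical_use": "short-range detection"},
    "radar":      {"max_range_m": 200, "typical_use": "ACC / AEB"},
    "lidar":      {"max_range_m": 200, "angular_res_deg": 0.1,
                   "typical_use": "3D perception"},
}

def sensors_covering(distance_m):
    """Return the modalities whose assumed range covers a given distance."""
    return sorted(name for name, spec in SENSORS.items()
                  if spec["max_range_m"] >= distance_m)

print(sensors_covering(3))    # all four modalities can see a 3 m obstacle
print(sensors_covering(150))  # only the long-range ones: ['lidar', 'radar']
```

This kind of coverage check is one way to reason about why a given autonomy level needs a given sensor suite: a parking maneuver is served by any modality, while highway-speed ACC requires the long-range ones.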

These technologies will no doubt evolve in the future. One motivation is to integrate them into passenger vehicles in order to achieve ADAS level 3. In 2019, 3D radars will become available, bringing a vertical field of view alongside Doppler sensing capabilities. 3D radar, camera and GNSS combined through sensor fusion could be the winning combination, good enough to reach the lane-level accuracy required for level 3. This scenario could drastically impact the LiDAR business and restrict the use of LiDAR in automotive. Furthermore, development of 4D imaging radar is expected to make it suitable for personal cars. First commercial products could be available in the 2025-2030 time frame, the right time for level 4 market penetration. For LiDAR, thanks to new technologies such as MEMS optical scanners and the photodetector arrays used in Flash LiDAR, prices are expected to drop significantly over a similar time frame. Another key factor in reducing prices is the market volume effect, in which large production volumes push prices down.

The sensing industry is on fire and nobody really knows what the final sensor combination for level 4 ADAS and beyond will be. As a consequence, the industry is investing in the most promising technologies for the future. A radar and LiDAR performance and cost race has started, which is beneficial to automated driving technology. Stay tuned!

Authors:

Alexis Debray - Yole Développement

Alexis Debray, PhD is a Technology & Market Analyst, Optoelectronics at Yole Développement (Yole). As a member of the Photonics, Sensing & Display division, Alexis is today engaged in the development of technology & market reports as well as the production of custom consulting projects dedicated to the imaging industry.
After spending two years at the University of Tokyo developing expertise in MEMS technologies, Alexis served as a research engineer at Canon Inc. Over 15 years, he contributed to numerous development projects focused on MEMS devices, lingual prehension, and terahertz imaging devices. Alexis is the author of various scientific publications and patents. He graduated from ENSICAEN and holds a PhD in applied acoustics.

Pierrick Boulay - Yole Développement

As part of the Photonics, Sensing & Display division at Yole Développement (Yole), Pierrick Boulay works as a Market and Technology Analyst in the fields of LED, OLED and lighting systems, carrying out technical, economic and marketing analyses. He has experience in both LED lighting (general lighting, automotive lighting…) and OLED lighting. In the past, he mostly worked in R&D departments on LED lighting applications. Pierrick holds a master’s degree in Electronics (ESEO – France).

Cedric Malaquin - Yole Développement

As a Technology & Market Analyst, specialized in RF devices & technologies within the Power & Wireless division at Yole Développement (Yole), Cédric Malaquin is involved in the development of technology & market reports as well as the production of custom consulting projects.
Prior to his mission at Yole, Cédric first served Soitec as a process integration engineer for 9 years, then as an electrical characterization engineer for 6 years. He contributed significantly to the characterization of FDSOI and RFSOI products. He has also authored or co-authored three patents and five international publications in the semiconductor field.
Cédric graduated from Polytech Lille in France with an engineering degree in microelectronics and material sciences.

Related reports

LiDAR for Automotive and Industrial Applications 2019
Is rationalization happening in the LiDAR market?

Radar and Wireless for Automotive: Market and Technology Trends 2019
The radar and 5G/V2X markets will both grow – one through market pull, the other through prospective enablement

Source: www.yole.fr
