Paul Ryabchuk
Blog post

How Sensor Fusion for Autonomous Cars Helps Avoid Deaths on the Road

Find out how the fusion of different types of autonomous vehicle sensors can help prevent road incidents

10 min read


Car manufacturers together with automotive technology development companies have already taken a huge step toward making autonomous driving genuinely feasible. But to be honest, there’s still a lot of work to be done to make self-driving cars commonplace on city streets. Autonomous vehicles have a lot of drawbacks and limitations, and on-road safety is one of the biggest challenges holding back the industry.

But don’t worry—there’s a solution. For more than a decade, Intellias has been working closely with automotive and automotive-related businesses. We know that advanced software technologies are key to overcoming current challenges. The right combination of sensor fusion algorithms and machine learning that reacts to sensor data can bring autonomous driving to a new level of safety.

Last year, 78% of Americans admitted that they would be afraid to ride in an autonomous vehicle. Safety concerns are the main thing preventing people from embracing self-driving technology. For people to trust an autonomous vehicle, they need to know how it sees the road and interprets obstacles.

78% of Americans are afraid to ride in an autonomous vehicle primarily due to safety concerns.

In this regard, the role of sensors that recognize the road environment can hardly be overestimated. Let’s see how the right mix of sensors in autonomous vehicles can make self-driving safe and build trust between vehicles and users.

Types of autonomous vehicle sensors 

An Advanced Driver Assistance System (ADAS) lies at the core of self-driving technology. In most cases, ADAS uses information from different types of autonomous vehicle sensors, including cameras, radar, and Lidar. Each type of sensor used in self-driving cars has its own specific tasks and its own location on the car.


Cameras for visual perception of the environment

Rear and 360-degree cameras are commonly used for assisted driving and in partially automated cars, and they are an indispensable data source for highly automated vehicles. The information received from cameras is used to create a 3D computer image of the surrounding environment, usually by interpreting input signals from four to six cameras located on the front, sides, and back of a vehicle. To produce an accurate 3D picture of the car’s surroundings, these cameras’ image sensors need a dynamic range of more than 130 dB paired with at least 24-bit image signal processors.

Read more: Learn about the integral parts of computer vision that enable autonomous cars to comprehend the physical environment

Today, rear and 360-degree video cameras are usually organized into a centralized system operated by an electronic control unit. A control unit is required to process and interpret the information received from all image sensors in order to use that information to affect the car’s on-road behavior. But a centralized approach to interpreting signals is rather fragile: several unexpected road events occurring at the same time may not be processed promptly and effectively.


Source: Three Sensor Types Drive Autonomous Vehicles

The alternative is a decentralized approach involving only smart cameras and a head unit. In this case, the mapping process is done in two stages. In the first stage, the received images are interpreted, compressed, and streamed while the fisheye model is created in parallel. The second stage involves decoding the signal and displaying an image on the screen inside the vehicle. A decentralized system—without a camera control unit—enables the car to react more quickly to multiple simultaneous road events.

A decentralized system—without a camera control unit—enables a car to react more quickly to multiple simultaneous events.


Source: Three Sensor Types Drive Autonomous Vehicles
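
To make the two-stage split between smart cameras and the head unit more concrete, here is a minimal sketch in Python. The class names, the fisheye parameters, and the use of zlib for compression are illustrative assumptions; real automotive cameras rely on dedicated video codecs and vendor-specific interfaces.

```python
import zlib
from dataclasses import dataclass, field

@dataclass
class CompressedStream:
    payload: bytes                                       # compressed, camera-side interpreted frame
    fisheye_params: dict = field(default_factory=dict)   # fisheye model built on the camera itself

class SmartCamera:
    """Stage 1 (on the camera): interpret the frame, build the fisheye model, compress, and stream."""
    def capture_and_stream(self, raw_frame: bytes) -> CompressedStream:
        fisheye = {"fov_deg": 190, "center_px": (640, 400)}   # illustrative parameters only
        return CompressedStream(payload=zlib.compress(raw_frame), fisheye_params=fisheye)

class HeadUnit:
    """Stage 2 (in the cabin): decode the stream and hand the image to the display."""
    def decode(self, stream: CompressedStream) -> bytes:
        return zlib.decompress(stream.payload)

# Each smart camera streams independently, so there is no central camera control unit
# that every frame has to pass through before the car can react.
raw_frame = bytes(1280 * 800)                             # stand-in for one raw camera frame
image = HeadUnit().decode(SmartCamera().capture_and_stream(raw_frame))
```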

Radar sensors to detect moving and stationary targets

Radio detection and ranging (radar) sensors detect and pinpoint objects using radio waves. Radar sensors used by auto manufacturers generally operate at frequencies from 24 GHz to 77 GHz. The higher the operating frequency, the more accurate the measurements of speed and distance in relation to moving objects and obstacles on the road. Radar applications are classified as short-range, mid-range, or long-range depending on the area covered by the radar.
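
As a rough illustration of what a radar sensor actually measures, the snippet below turns a pulse’s round-trip time into range and a Doppler frequency shift into relative speed. The 77 GHz carrier, the sample numbers, and the function names are assumptions for illustration, not the interface of any particular radar chipset.

```python
C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_time_s: float) -> float:
    """Distance to a target from the echo's round-trip time: R = c * t / 2."""
    return C * round_trip_time_s / 2.0

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative (radial) speed from the Doppler shift: v = f_d * c / (2 * f_c).
    77 GHz is a typical carrier frequency for long-range automotive radar."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# Example: an echo arriving after 1 microsecond puts the target ~150 m away,
# and a +5 kHz Doppler shift at 77 GHz corresponds to a closing speed of ~9.7 m/s.
print(radar_range(1e-6))       # ~149.9 m
print(radial_speed(5_000.0))   # ~9.73 m/s
```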

Mid-range and long-range radar applications are responsible for brake assistance, distance and speed control, and emergency braking. Yet in terms of enabling fully autonomous self-driving technology, short-range radar plays a key role in:

  • Blind spot detection (blind spot monitoring) 
  • Lane-keeping and lane-change assistance 
  • Collision warning and collision avoidance 
  • Parking assistance

Radar systems for vehicles were initially designed to replace less accurate ultrasonic sensors. Short-range radio wave sensors are located at a vehicle’s corners, while those used for mid-range and long-range detection are usually placed at the front and back. In most cases, information received via radar is processed through a monolithic microwave integrated circuit (MMIC) or is interpreted as raw data by an electronic control unit.


Source: Three Sensor Types Drive Autonomous Vehicles

Read more: Find out the main tech pillars of autonomous driving that will introduce self-driving vehicles into our everyday routines

Road control with laser transmission via Lidar system

A light detection and ranging (Lidar) system is used to measure distance in relation to both static and moving objects on the road. By processing the data received from laser transmitters, a Lidar system builds a detailed three-dimensional image of on-road objects. Car manufacturers can use complex Lidar-based mirror systems to capture spatial images of objects located in the full 360-degree area around the vehicle.

A light detection and ranging (Lidar) system is used to measure distance in relation to both static and moving objects on the road.


Source: Three Sensor Types Drive Autonomous Vehicles
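
To show how a single laser return becomes a point in the vehicle’s 3D picture, the sketch below converts a pulse’s time of flight plus the beam’s azimuth and elevation angles into Cartesian coordinates. This is generic geometry for illustration; the axis convention and the sample scan pattern are assumptions rather than the output format of any specific Lidar unit.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lidar_point(time_of_flight_s: float, azimuth_deg: float, elevation_deg: float):
    """Convert one laser return into an (x, y, z) point relative to the sensor.
    The distance follows from the pulse's round trip: d = c * t / 2."""
    d = C * time_of_flight_s / 2.0
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = d * math.cos(el) * math.cos(az)   # forward
    y = d * math.cos(el) * math.sin(az)   # left
    z = d * math.sin(el)                  # up
    return x, y, z

# A full scan is just many such returns; stacking them gives the 3D point cloud
# the Lidar system uses to outline objects around the vehicle.
points = [lidar_point(2.0e-7, az, -1.0) for az in range(0, 360, 10)]   # ~30 m ring of points
```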

There are two main approaches to using Lidar in autonomous vehicles. The first uses Lidar as a self-sufficient system, while the other uses it in combination with a Micro-Electro-Mechanical System (MEMS). Lidar as a solid-state technology involves the use of several laser diodes and a high-powered receiver. If supporting MEMS technology is applied, additional movable micro mirrors are used, making the system more accurate yet more complex to use.

The cost of ADAS depends on what kind of sensors are used in self-driving cars. Lidar systems remain more expensive than cameras and radar sensors, so mass use in the near future is questionable. Some major industry players have already expressed their determination to develop fully autonomous technology without Lidar sensors. Tesla’s Elon Musk is a major opponent of Lidar, stating that advanced camera sensors with computer vision algorithms will soon be enough to make self-driving feasible, safe, and effective.


Sensor fusion for autonomous driving 

Each individual type of autonomous vehicle sensor system has its own limitations and shortcomings. For example, a camera sensor may not work accurately in rain, fog, or sun glare. In these situations, information from radar sensors may help the vehicle continue to drive reliably, as bad weather conditions have little impact on radar’s effectiveness. The idea of sensor fusion for autonomous driving is to use a mix of sensors to make up for the drawbacks of each individual sensor. Let’s take a closer look at specific examples of multi-sensor data fusion for autonomous vehicles.
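
Before diving into the specific pairings, here is the core idea in miniature: when two sensors report the same quantity, weight each reading by how much it can be trusted. The sketch below fuses two independent distance estimates in inverse proportion to their variances, which is the building block behind Kalman-filter-style sensor fusion; the variance figures are made up for illustration.

```python
def fuse_estimates(value_a: float, var_a: float, value_b: float, var_b: float):
    """Variance-weighted fusion of two independent measurements of the same quantity.
    The less noisy a sensor is, the more it pulls the fused value toward itself."""
    w_a = var_b / (var_a + var_b)
    w_b = var_a / (var_a + var_b)
    fused = w_a * value_a + w_b * value_b
    fused_var = (var_a * var_b) / (var_a + var_b)   # always <= min(var_a, var_b)
    return fused, fused_var

# In fog the camera's distance estimate becomes noisy (high variance),
# so the fused result leans on the radar reading instead.
camera_distance, camera_var = 48.0, 9.0   # meters, meters^2 (illustrative values)
radar_distance, radar_var = 51.0, 1.0
print(fuse_estimates(camera_distance, camera_var, radar_distance, radar_var))
# -> (~50.7 m, 0.9 m^2): close to the radar value, with lower uncertainty than either sensor alone
```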


Fusion of front camera and multimode front radar

Radar sensors are one of the most effective ADAS tools to measure speed and distance in relation to both static and moving objects located within 150 meters of a vehicle. One of the major advantages of radar is its effectiveness under all kinds of weather conditions. At the same time, camera sensors are best at differentiating objects on the road, including road markings and street signs.

So a combination of front radar sensors and camera sensors with different fields of view can enable a self-driving car to identify both moving and static objects within 150 meters of a vehicle under any weather conditions. This, in turn, permits secure and reliable use of the emergency braking system and cruise control function as well as safe stop-and-go, lane change, and other maneuvers.

A front radar sensor combined with cameras enables a vehicle to identify both moving and static objects within 150 meters under any weather conditions.
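
In practice, the fusion logic first has to decide which radar target corresponds to which camera detection before their measurements can be combined. Below is a minimal sketch of that association step, matching objects by azimuth angle; the data layout and the two-degree threshold are assumptions for illustration.

```python
def associate(camera_detections, radar_targets, max_angle_gap_deg=2.0):
    """Pair each camera detection with the radar target closest in azimuth.
    camera_detections: list of (object_id, azimuth_deg) from the vision pipeline
    radar_targets:     list of (azimuth_deg, range_m, speed_mps) from the front radar
    Returns (object_id, range_m, speed_mps) tuples for matched pairs."""
    matches = []
    for obj_id, cam_az in camera_detections:
        best = min(radar_targets, key=lambda t: abs(t[0] - cam_az), default=None)
        if best is not None and abs(best[0] - cam_az) <= max_angle_gap_deg:
            matches.append((obj_id, best[1], best[2]))   # camera identity + radar range/speed
    return matches

# The camera says what the object is; the radar says how far away it is and how fast it moves.
cams = [("car", 1.5), ("pedestrian", -10.0)]
radar = [(1.2, 42.0, -3.0), (-9.6, 18.0, 0.0)]
print(associate(cams, radar))   # [('car', 42.0, -3.0), ('pedestrian', 18.0, 0.0)]
```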

Fusion of rearview cameras with ultrasonic or radar sensors

The combination of a rear-view camera with ultrasonic or radar sensors is an example of how sensor fusion can assist an autonomous vehicle in parking. Today, ultrasonic sensors for identifying objects and measuring distances to them are a mature technology in the automotive industry. The current trend is to replace these ultrasonic sensors with more functional and accurate radar sensors, making parking-related maneuvers even safer and more precise.

Rear-view cameras provide information regarding the environment and objects behind the car, which is necessary for the vehicle to determine and execute the most reasonable type of maneuver. At the same time, ultrasonic or radar sensors help to avoid accidents by accurately evaluating distances to static and moving objects around a vehicle. The point is that using data received from radar or ultrasonic sensors together with a rear-view camera allows for advanced parking features that are not possible with only one type of sensor.

Radar or ultrasonic sensors together with a rear-view camera allow for parking features that would be impossible with a single type of sensor.
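
The ultrasonic side of this pairing boils down to echo timing: the pulse travels to the obstacle and back, so the clearance is half the echo delay multiplied by the speed of sound. Here is a minimal sketch of that calculation together with a simple proximity warning; the warning thresholds are illustrative assumptions.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def ultrasonic_distance(echo_delay_s: float) -> float:
    """Distance to an obstacle from the ultrasonic echo delay: d = v * t / 2."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def parking_warning(distance_m: float) -> str:
    """Map the measured clearance to a driver warning level (illustrative thresholds)."""
    if distance_m < 0.3:
        return "stop"
    if distance_m < 1.0:
        return "slow"
    return "clear"

# Example: a 5 ms echo corresponds to roughly 0.86 m of clearance behind the car.
d = ultrasonic_distance(0.005)
print(round(d, 2), parking_warning(d))   # 0.86 slow
```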

Integrating platforms based on sensor fusion algorithms

The sensors used in autonomous vehicles generate a vast amount of raw data that requires further processing and integration. Data integration platforms play a key role in enabling a vehicle to accurately assess its surroundings using information received via sensors. And partnerships between original equipment manufacturers and software developers play a decisive role in building reliable and effective integration platforms.

Read more: Learn about how manufacturers and software developers are resolving the issues challenging self-driving technology

BlueBox by NXP

One of the most successful examples of sensor fusion integration technologies is the BlueBox platform developed by automotive vendor NXP. BlueBox was designed to combine sensor fusion, analysis, and complex networking. Put simply, BlueBox is a centralized controller that collects raw data from different types of sensors, analyzes it, and determines the most appropriate vehicle behavior.

Technologically speaking, BlueBox consists of two main elements. The first is a networking device that processes data received from radar, cameras, Lidar, and other vehicles, while the second is a safety controller aimed primarily at reinforcing the security of autonomous driving. The platform’s processing capacity is rated at 90,000 Dhrystone million instructions per second (DMIPS).

DRIVE PX by NVIDIA

The DRIVE PX integration platform was developed by NVIDIA in cooperation with vendors Elektrobit and Infineon. The platform runs on NVIDIA’s self-driving car computer and is supported by an AUTOSAR 4.x-compliant software suite using NVIDIA’s Tegra processor and Infineon’s AURIX 32-bit TriCore microcontroller.

The DRIVE PX platform is capable of processing information received from a maximum of 12 camera sensors along with radar, Lidar, and ultrasonic sensors. The processing capacity of the platform is enough to enable a self-driving car to operate effectively in a 360-degree environment with changing static and dynamic objects. Innovative use of a Deep Neural Network (DNN) for detecting and identifying objects is definitely one of the major virtues of DRIVE PX.

Yet integration platforms and sensor fusion technologies can be genuinely effective only in combination with software solutions. Cloud-based platforms powered by real-time data are an example of such a solution. Shared access to open cloud-based platforms containing records of real-time and historical sensor data, multilayer maps, and predictive machine learning is a prerequisite to safe and secure driving.

Read more: Learn from our case study about how Intellias’ experts are developing a data platform for autonomous driving

Over-the-air map update solutions are another example of how to develop an ecosystem for safe autonomous driving. In particular, cloud-based over-the-air map updates can be used to convert atypical map formats to meet the Navigation Data Standard, reinforcing the security and feasibility of self-driving vehicles. Similarly, data-driven protocols can help advanced driver assistance systems perform better in cruise control mode by optimizing lane assistance and controlling the powertrain depending on slope, curvature, and speed limit.


Automotive sensor fusion is a core element underlying the effective and sustainable performance of advanced driver assistance systems. The smart processing and sharing of data received via different types of sensors in autonomous vehicles, including cameras, radar, and Lidar, is a prerequisite for secure and practical autonomous driving. Advanced software solutions can build trust between vehicles and users, removing obstacles to the mass adoption of self-driving vehicles. By being at the forefront of sensor fusion technology, you can become a trendsetter in one of the most promising and rapidly developing market segments.


Our Intellias experts know exactly how your business can benefit from sensor fusion technology. Contact us and get a step ahead of your competitors in the race for industry leadership.
