Sensor fusion – key components for autonomous driving

For vehicles to be able to drive autonomously, they must perceive their surroundings with the help of sensors: An overview of camera, radar, ultrasonic and LiDAR sensors.

A VW Beetle is beautiful to look at, but it does not perceive its surroundings. However, capturing the environment, as we humans do with our senses, is essential for cars to be able to drive autonomously. Modern vehicles are therefore equipped with a wide variety of sensors that help them detect their surroundings and thus support drivers or even relieve them of certain tasks such as parking. Below is an overview of the various sensors required for autonomous driving.

The most important vehicle sensors for perceiving the environment are cameras, radar, ultrasonic, and LiDAR sensors. With the exception of cameras, they are all based on the time-of-flight principle.

How do all these perception technologies differ and what are their strengths and weaknesses? Which distance sensor is best suited for autonomous driving? LiDAR versus radar? Will the cars of the future navigate with the help of cameras, or should manufacturers rely on sensor fusion?

Time-of-flight principle in brief:

Time-of-flight indirectly measures distances and speeds based on the time it takes for a signal to hit an object and be reflected back. This principle can be found in the animal kingdom and is also known as echolocation, which is used by dolphins or bats for orientation.
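The time-of-flight calculation itself is simple: the signal travels to the object and back, so the distance is the propagation speed times half the round-trip time. A minimal sketch in Python (the echo time and speed values below are illustrative, not taken from a real sensor):

```python
def tof_distance(round_trip_s: float, wave_speed_m_s: float) -> float:
    """Distance to the reflecting object via the time-of-flight principle.

    The signal covers the distance twice (out and back), hence the division by 2.
    """
    return wave_speed_m_s * round_trip_s / 2

# A bat's ultrasonic chirp (speed of sound in air is roughly 343 m/s)
# returning after 10 ms means the obstacle is about 1.7 m away.
print(tof_distance(0.010, 343.0))  # → 1.715
```

The same formula underlies radar, ultrasound, and LiDAR; only the signal type and its propagation speed change.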

A look at the individual technologies provides some helpful information:

Color vision thanks to cameras

Camera field of view around the car

Cameras are already an integral part of new production vehicles: They make maneuvering and parking easier. Moreover, cameras enable the use of systems such as adaptive cruise control or lane departure warning while driving. In addition to being installed outside the vehicle, cameras will also be used in interiors in the near future. They will detect whether drivers are distracted, not wearing their seat belts, or tired, for example. This is particularly important for the next stages of development in autonomous driving – such as when the vehicle is driving in motorway pilot mode and the driver always needs to be ready to take control.

High-resolution color images depending on environmental factors

Camera recordings show a visual representation of the world in color. Apart from the colors, they can also provide texture and contrast data. This information can be evaluated with the aid of software, for example in order to reliably identify a road marking or a traffic sign. Both static and moving objects can be precisely detected and identified. Since camera technology is based on a passive measuring principle, objects are only detected if they are illuminated. The reliability of cameras is therefore limited in difficult environmental conditions such as snow, ice, or fog and in darkness. Furthermore, cameras do not provide direct distance information. In order to obtain 3D images, at least two cameras are required, as is the case with stereo cameras, or image recognition software, which demands high computing performance.

Deep dive: Mono vs. stereo cameras
In general, two different systems are used: Mono and stereo cameras. What are the differences between these systems?

Mono cameras (one “eye”) have one camera lens and one image sensor and provide 2D images. These images are used, among other things, as the basis for lane assistants, recognizing traffic signs, and smart headlight control. Distance measurements are, however, not possible. Distances can only be calculated using complex, often self-learning algorithms.

Stereo cameras (two “eyes”) are the more expensive and larger systems. They consist of two camera lenses and two image sensors. Stereo cameras take two images from different angles simultaneously. A 3D image is created by matching them. This enables distances and speeds to be calculated, making it possible to estimate distances. Stereo cameras are already being used in some production vehicles and provide information for driver assistance systems such as adaptive cruise control and emergency brake assist.
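The depth estimate from a stereo pair comes from triangulation: a point that appears shifted (the disparity) between the left and right image is closer the larger that shift is. A simplified sketch of the standard pinhole relation, with illustrative example values:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a matched point from its disparity between the two images.

    Assumes a rectified stereo pair: depth = focal length * baseline / disparity.
    """
    return focal_px * baseline_m / disparity_px

# Illustrative values: 1000 px focal length, 20 cm lens baseline,
# a feature shifted by 8 px between the two images → 25 m away.
print(stereo_depth(1000.0, 0.20, 8.0))  # → 25.0
```

Note the trade-off this formula implies: the smaller the disparity (i.e. the farther the object), the larger the depth error for a given pixel of matching noise, which is why stereo cameras are most precise at short and medium ranges.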

Radar – established distance sensor

Radar sensors (radio detection and ranging) have gained widespread fame thanks to the so-called “radar traps.” In recent decades they have also been installed in vehicles to measure distances, providing reliable data for systems such as adaptive cruise control and emergency brake assist regardless of weather conditions.

Radar sensors around the vehicle

How do radar sensors measure distances? Radar technology is based on the time-of-flight principle. The sensors emit short pulses in the form of electromagnetic waves (radio waves), which are propagated almost at the speed of light. As soon as the waves hit an object, they are reflected and bounce back to the sensor.

The shorter the time interval between transmission and reception, the closer the object is.

Since the propagation speed of the waves is known, the distance to the object can be calculated from this time interval with great precision. By stringing together several measurements, the vehicle sensors can also determine speeds. This technology enables the use of driver assistance systems such as adaptive cruise control and collision avoidance.
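Both steps can be sketched in a few lines: the range follows from the round-trip time and the speed of light, and stringing two range measurements together yields the relative speed. The echo times and measurement interval below are illustrative:

```python
C = 299_792_458.0  # speed of light in m/s

def radar_range(round_trip_s: float) -> float:
    """Distance to the object from the echo's round-trip time."""
    return C * round_trip_s / 2

def relative_speed(range_t0_m: float, range_t1_m: float, dt_s: float) -> float:
    """Relative speed from two successive range measurements.

    A negative value means the object is approaching.
    """
    return (range_t1_m - range_t0_m) / dt_s

# Illustrative values: an echo after 400 ns puts the object at ~60 m;
# 50 ms later the echo returns after 390 ns.
r0 = radar_range(400e-9)               # ≈ 59.96 m
r1 = radar_range(390e-9)               # ≈ 58.46 m
print(relative_speed(r0, r1, 0.05))    # ≈ -30 m/s, i.e. closing at ~108 km/h
```

Production radars actually extract speed more directly from the Doppler shift of the returned wave, but the successive-measurement view above matches the principle described here.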

Weather-resistant with limited resolution

Radar sensors are robust, inexpensive, and usually provide reliable data even in adverse weather conditions. However, they have greater difficulty in identifying and differentiating between objects. The reason for this is the low resolution of radar data, which means that objects can be detected but not classified.

Strengths and weaknesses of the radar as a sensor for autonomous driving

Deep dive: Short-range vs. long-range radar
Today, two different radar systems are mostly used to cover both the short and the long ranges.

Short-range radar: The close range (up to 30 meters) is detected by short-range radar, which is usually based on a frequency band in the 24 GHz spectrum. It is compact, has low interference problems, and is the less expensive version. Short-range radar facilitates parking maneuvers, monitors blind spots, and warns the driver of collisions.

Long-range radar: Long-range radar is used to detect objects and vehicles at distances of up to 250 meters and measure their speed. This technology uses frequencies between 76 GHz and 77 GHz and has a higher performance. However, due to the low resolution, objects at great distances cannot always be reliably distinguished. As long-range radar enables, among other things, emergency brake assist and adaptive cruise control even at high speeds, it plays an important role in implementing the next steps in autonomous driving, such as motorway pilots.

Ultrasound – close-range specialist

There is hardly a vehicle nowadays that is not equipped with a parking aid. If the vehicle approaches a parking post, for example, a warning tone sounds and colored bars are displayed on the on-board computer. These warning signals provide information about where exactly the post is located in the monitored area and thus in the direct vicinity of the vehicle. This assistance system is made possible by several ultrasonic sensors, which are usually installed in the bumpers around the vehicle.

Ultrasonic sensors around the vehicle

Ultrasound is also based on the time-of-flight principle. Here, sound waves inaudible to the human ear are emitted at frequencies above 20,000 Hz. Apart from parking assistance, ultrasonic sensors are also used for blind spot monitoring and emergency brake assist.
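Applied to a parking aid, the time-of-flight calculation uses the speed of sound rather than the speed of light. A small sketch with illustrative values (the speed of sound in air varies with temperature; 343 m/s holds at about 20 °C):

```python
SPEED_OF_SOUND_AIR = 343.0  # m/s at ~20 °C; real sensors compensate for temperature

def ultrasonic_distance(echo_time_s: float) -> float:
    """Distance to an obstacle from the ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND_AIR * echo_time_s / 2

# Illustrative value: an echo returning after 3.5 ms puts the
# parking post about 0.6 m behind the bumper.
d = ultrasonic_distance(0.0035)
print(f"{d:.2f} m")  # → 0.60 m
```

Because sound travels roughly a million times slower than radio waves, the echo times are milliseconds instead of nanoseconds, which makes the electronics simple and cheap but limits the practical range.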

Compact distance sensor with limited range

Ultrasonic sensors are robust and provide reliable distance data, both at night and in fog. They are also cost-effective and capable of detecting objects regardless of material or color. However, the range of these vehicle sensors is limited to less than 10 meters, which means this technology can only be used at close range.

Deep dive: Sonar
The term sonar is often used in connection with ultrasound, i.e. the application of ultrasound in the maritime sector.
Sonar (sound navigation and ranging): Sonar is a measuring technique that uses sound waves, usually ultrasound, for localization. It is mainly used under water, as the propagation of sound, especially at high frequencies, is much less lossy than in the air. The distance can be calculated based on the speed of the sound under water and the reflection time from the object.

LiDAR – reliable environmental information in 3D

In contrast to ultrasonic sensors, LiDAR (light detection and ranging) sensors are suitable for both short- and long-range use. Although they have existed for many years, they have only been used increasingly in vehicles since the 2000s. LiDAR is considered a key technology for achieving higher levels of autonomy.

Front-facing LiDAR in a vehicle for long range

Key technology LiDAR: An essential prerequisite for the next level of autonomous driving is the avoidance of collisions, which requires reliable, high-resolution 3D data. Only LiDAR provides this data even at high speed and over a long range.

LiDAR sensors are also based on the time-of-flight principle. Instead of radio or ultrasonic waves, however, they emit laser pulses that are reflected by an object and picked up again by a photodetector. LiDAR sensors emit up to one million laser pulses per second and combine the results into a high-resolution 3D map of the environment.
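Each laser pulse yields one point of that 3D map: the measured range plus the known direction of the beam determine an (x, y, z) coordinate. A minimal sketch of that spherical-to-Cartesian conversion (the angle convention and example values are illustrative):

```python
import math

def lidar_point(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one range measurement plus beam angles into an (x, y, z) point.

    Convention assumed here: x points forward, y left, z up;
    azimuth is measured in the horizontal plane, elevation above it.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One echo: 50 m straight ahead, 2 degrees above the horizon.
print(lidar_point(50.0, 0.0, 2.0))
```

Repeated up to a million times per second across the scan pattern, these points form the dense point cloud described below.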

High resolution at long range

These so-called point clouds are so detailed that objects can not only be recognized, but also categorized. For example, a pedestrian can be distinguished from a cyclist. LiDAR sensors have a long range, are robust, and therefore provide reliable data largely independent of environmental factors, enabling vehicles to make the right driving decisions. In the past, however, the sensors were often very expensive, mainly due to the complex and maintenance-intensive design of the mechanically rotating devices. Thanks to the solid-state design, which is becoming increasingly established, the cost of high-resolution 3D sensors is falling considerably.

Deep dive: Rotating vs. solid-state LiDAR
Two of the most popular LiDAR systems are the rotating sensors and the solid-state sensors.

Mechanically rotating LiDAR systems: Mechanical systems use gears and motors to rotate the laser diodes and thus direct the laser pulses over the environment. The rotation makes a field of view of up to 360° possible. However, the manual setup is complex and cost-intensive. Even in large quantities, unit prices are therefore too expensive for use in series production vehicles. Due to their design, mechanically rotating sensors are also more sensitive to vibrations, for example. They are currently used in robot taxi fleets, among other things.

Solid-state LiDAR systems: This design is based on semiconductor technology and does not have any mechanical moving parts. The systems are therefore less complex, more compact, and maintenance-free. They are also less expensive and can be better produced in series. Solid-state LiDAR systems thus play a decisive role towards achieving the next level in autonomous driving, as they can be used in series-production vehicles of all classes.

Leveraging strengths through sensor fusion

Sensor fusion with camera, radar, ultrasonic and LiDAR

Safety is the top priority for autonomous driving and therefore vehicles must always have a detailed view of their surroundings. To make this possible, camera, radar, ultrasound, and LiDAR sensors can assist one another as complementary technologies.

The main aim is to use the strengths of the various vehicle sensors to compensate for the weaknesses of others and thus ultimately enable safe autonomous driving with sensor fusion.
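One common way to combine such complementary measurements is to weight each sensor's estimate by its precision, so that a noisy radar reading contributes less than a precise LiDAR reading of the same distance. A minimal sketch of this inverse-variance weighting (the variance values below are illustrative, not real sensor specifications):

```python
def fuse_estimates(measurements):
    """Inverse-variance weighted fusion of independent estimates of one quantity.

    Each measurement is a (value, variance) pair; sensors with smaller
    variance (higher precision) receive a larger weight.
    """
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    return fused

# Illustrative example: radar reports 40.5 m with high noise,
# LiDAR reports 40.1 m with low noise; the fused estimate sits near the LiDAR.
print(fuse_estimates([(40.5, 0.25), (40.1, 0.01)]))  # → ~40.115
```

Real fusion stacks are far more elaborate (Kalman filters, object-level association across sensors), but the underlying idea is the same: let each sensor's strengths dominate where it is most reliable.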
