One step closer to autonomous driving: Seamless integration of the sensor stack

Years ago, many automotive companies – new and incumbent – promised a rapid rise in autonomous mobility. Time has proven that mastering autonomous vehicles is a task much harder than initially thought. There are a few core elements that are often overlooked but are required to make autonomous driving a reality.

Today, around 1.3 billion vehicles drive on streets around the globe, steered by us humans. The human “sensor stack” consists of several sensing modalities, and the most important ones for driving are our eyes and ears. With our eyes, we try to cover a 360° view around the vehicle at varying distances – short and long. To achieve this 360° view, we are supported by three mirrors: two side-view mirrors and one rear-view mirror. All information captured by this “sensor head” is then processed by a central processing unit – our brain – and combined with our driving experience and knowledge of the traffic rules to navigate ton-heavy vehicles at various speeds and in various environments. Additionally, the automotive industry has invented a number of ADAS (Advanced Driver Assistance Systems) features to make our lives easier, safer, and more convenient, such as cruise control, lane assist, and emergency braking.

Looking at this sensor stack of human eyes and three mirrors, we are all very well aware of our blind spots, reaction times, and attention deficits. Clearly, there remains a residual risk that society has accepted. To put it into numbers: in Germany alone, that residual risk amounts to more than 2.6 million accidents every year, around 300,000 of which involve bodily harm.

How to imitate the human “sensor stack”

If we now turn our view towards autonomous vehicles (AVs), the industry is replicating the human driver and their sensor stack with various machine-based sensors: first and foremost radar, cameras, and LiDAR, and to some extent microphones, ultrasonic sensors, and other modalities. To get 360° coverage of the vehicle's surroundings at both short and long range, several sensors of varying types are needed; as a rule of thumb, experts are looking at 30-40 sensors per vehicle. Additionally, all the captured information needs to be processed and fused with high-definition maps, so a powerful computing system is required to “digest” the huge data streams from these sensors and make millisecond decisions. It is undisputed that machine-based sensors can capture the environment much better than a single human being, but interpreting the data correctly and taking the right actions is a huge challenge.
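As a rough sketch of why the sensor count climbs so quickly, one can estimate how many sensors are needed for overlapping 360° coverage. The figures below (a 120° horizontal field of view per sensor, 20° of overlap between neighbours, three modalities, three range tiers) are illustrative assumptions, not measured values:

```python
import math

def sensors_for_360(fov_deg: float, overlap_deg: float) -> int:
    """Minimum number of sensors in one ring so that neighbouring fields
    of view overlap by overlap_deg while still covering all 360 degrees."""
    effective = fov_deg - overlap_deg  # unique coverage each sensor adds
    return math.ceil(360 / effective)

# Illustrative assumption: 120-degree sensors with 20 degrees of overlap
per_ring = sensors_for_360(fov_deg=120, overlap_deg=20)

# One ring per modality (camera, radar, LiDAR) and per range tier
# (short, mid, long) quickly lands in the 30-40 sensor range.
modalities, range_tiers = 3, 3
total = per_ring * modalities * range_tiers

print(per_ring, total)  # 4 sensors per ring, 36 in total
```

Real layouts are less regular than this – long-range sensors point mostly forward, and some positions serve double duty – but the multiplication of coverage rings by modalities and ranges is what drives the total into the dozens.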

Looking at the physical structure of today’s AVs, they carry all kinds of attachments on the roof or on the sides, as many of us have seen on test vehicles on the road. These meet the requirements for testing purposes and allow the sensors to be positioned exactly where they need to be for 360° coverage at short and long range. But this set-up is not mass-market ready, and a few challenges beyond performance, cost, and software need to be addressed.

Vulnerable road users need to be protected

Safety is a widely discussed topic when it comes to AVs. And while it is essential that the vehicle itself navigates safely, it is also important to look at how road users are protected beyond the automated driving functions themselves. NCAP (New Car Assessment Program) is today’s measure of how well a vehicle protects its occupants as well as other road users across the globe. If today’s test AVs were evaluated in an NCAP test, they would most likely get very low ratings: even though AVs will be safer than human-driven vehicles, there will still be accidents. And as of now, the large number of sensors sticking out of the vehicle massively increases the danger of bodily harm, especially for vulnerable road users (VRUs), whether pedestrians, cyclists, or motorcyclists.

Vulnerable road users crossing the street

Next is the challenge of environmental conditions such as rain, fog, snow, ice, insects, and dust. The AV sensors need to function at all times, so heating and cleaning are core issues. The sensors need to be part of the overall vehicle design, including, for example, cleaning-fluid lines, to ensure reliable operation.

Power consumption is an important factor

The automotive industry is fighting for every kilometer of range on electric vehicles (EVs) and every gram of CO2 on fossil-fuel-powered vehicles (vehicles with internal combustion engines – ICEs). An AV has 30-40 sensors, each consuming between 2 and 25 W, plus one or more additional processing units. This amounts to roughly 1,300 W of additional power consumption and would add around 35 g of CO2 per km to an ICE and around 11 g of CO2 per km to an EV (based on the WLTP cycle) in the ideal case. And this estimate does not account for additional air drag or weight; air drag already becomes a significant factor in power consumption at 80 km/h. Two things are therefore very clear. First, the automotive industry needs to electrify before it automates to allow for the additional CO2 footprint. Second, for the adoption of AEVs (autonomous electric vehicles), it is essential to integrate the sensor stack completely into the vehicle design for aerodynamic reasons, to minimise air drag and avoid adding unnecessarily to the power consumption.
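These back-of-the-envelope figures can be reproduced with a short calculation. The factors used below – the WLTP average speed, the grid carbon intensity for the EV, and the engine-plus-alternator chain efficiency and fuel carbon content for the ICE – are illustrative assumptions chosen to match the magnitudes in the text, not official values:

```python
# Rough reproduction of the ~35 g/km (ICE) and ~11 g/km (EV) estimates.
# All factors below are illustrative assumptions, not measured data.

TOTAL_POWER_W = 1300        # 30-40 sensors plus compute, as in the text
WLTP_AVG_SPEED_KMH = 46.5   # approximate average speed of the WLTP cycle

extra_wh_per_km = TOTAL_POWER_W / WLTP_AVG_SPEED_KMH  # ~28 Wh per km

# EV: extra electricity drawn from the grid (assumed ~400 g CO2 per kWh)
GRID_G_CO2_PER_KWH = 400
ev_g_per_km = extra_wh_per_km / 1000 * GRID_G_CO2_PER_KWH

# ICE: electricity generated from fuel via the engine (~30% efficient)
# and alternator (~70%), with gasoline at ~260 g CO2 per kWh of energy
CHAIN_EFFICIENCY = 0.30 * 0.70
FUEL_G_CO2_PER_KWH = 260
ice_g_per_km = extra_wh_per_km / CHAIN_EFFICIENCY / 1000 * FUEL_G_CO2_PER_KWH

print(round(ev_g_per_km), round(ice_g_per_km))  # prints: 11 35
```

The ICE figure is roughly three times the EV figure simply because turning fuel into on-board electricity loses most of the energy along the way, which is why the extra electrical load of the sensor stack hurts an ICE far more than an EV.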

Design and Style is a major selling argument

Apart from safety and power consumption, a vehicle’s design signature and style also play a major role. Just imagine a premium car with all kinds of sensors sticking out of the vehicle body – unthinkable! Another important aspect to consider is vandalism and theft. Current test AVs are sadly often the target of vandalism, and as some sensors are very expensive, it is essential to protect the stack from harm or theft.

Solution: Seamless integration of the sensor stack

All these challenges – safety requirements, power consumption, and design factors – can be met by integrating the sensor stack seamlessly into the vehicle. Placements range from the headlights, roof, grille, and bumpers to the side-view mirrors and the A-, B-, or C-pillars to achieve a 360° surround view of the vehicle.

Sensor placement options around the vehicle for 360° coverage

By fully integrating the sensors, the risk of bodily harm to road users is reduced, cleaning and heating solutions can be developed easily, and the sensors do not increase air drag, so they “only” add their intrinsic power consumption. To include the sensor stack in the design, it is crucial to develop sensors that are small in size and weight while offering the largest possible field of view, and that ideally can be placed anywhere on the vehicle.

Vision Mini
Blickfeld’s ultra-compact mid-range automotive LiDAR comes in dimensions as small as 5 x 5 x 5 cm and can be seamlessly integrated around the vehicle for 360° coverage.

Wider perspective required for mass market adoption of AEVs

Typically, the discussion around AVs revolves around cost, performance, and the software stack. However, to make mass-market AVs a reality, a wider perspective is needed. The topics above may not be on everyone’s agenda at first thought, but they certainly need to be addressed. A realistic look shows that AVs require not only pushing sensor and processing technology forward but also a focus on full 360° integration, to make our lives safer, more energy-efficient, and more convenient.


