How multi-sensory data enhances advanced driver assistance systems (ADAS)

Jul 31, 2024

This article takes a look at the current state of advanced driver assistance systems (ADAS) and where it’s headed.

As cars have become intelligent, manufacturers have focused on implementing certain smart features at the expense of others. Sensors tied to rearview cameras and blind-spot warnings are now nearly ubiquitous, but other safety tools have lagged behind. This uneven adoption is slowing the progress of advanced driver assistance systems (ADAS), a suite of technologies that assist drivers with the safe operation of a vehicle.

The problem is that the majority of smart safety features rely on perception sensors. To make ADAS truly valuable, manufacturers must give these systems senses comparable to those of a human driver. Vehicles need to feel the road, not just see it. With that in mind, let’s take a look at the current state of ADAS and where it’s headed.

Current ADAS trends

Perception sensors are now commonplace in vehicles. Encouragingly, more perception sensors are emerging to fulfill different functions, so vehicles no longer rely on a single technology but draw on a diverse and robust arsenal to improve safety and performance. These include blind-spot monitoring sensors, lane departure warning systems, and forward collision sensors.

While perception sensors once relied only on cameras, additional sensors such as radar and light detection and ranging (lidar) technology are emerging as powerful complementary tools. Lidar systems use laser beams to measure distances and create detailed, high-resolution maps of the surroundings. This technology enhances object detection and provides a more nuanced understanding of the environment, especially in challenging conditions such as low visibility or adverse weather.
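To make the time-of-flight idea behind lidar ranging concrete, here is a minimal sketch (not production code) that converts a laser pulse’s round-trip time into a distance estimate; the 200 ns measurement and the function name are purely hypothetical.

```c
#include <stdio.h>

/* Illustrative only: lidar range from a single time-of-flight measurement.
 * Assumes the sensor reports the round-trip time of the laser pulse. */
#define SPEED_OF_LIGHT_M_PER_S 299792458.0

static double lidar_range_m(double round_trip_time_s)
{
    /* The pulse travels to the target and back, so halve the path length. */
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0;
}

int main(void)
{
    double tof_s = 200e-9; /* hypothetical 200 ns round trip */
    printf("Estimated range: %.2f m\n", lidar_range_m(tof_s)); /* ~29.98 m */
    return 0;
}
```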

Radar and lidar help determine the distance between the vehicle and an object, which is the main deficiency of cameras. We’re seeing an increased push to enhance visual information with more accurate distance detection and object recognition. Detecting the proximity of one vehicle to another more accurately and activating emergency braking if one vehicle gets too close to another is an example of the enhanced safety offered by stacking different perception technologies together.
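As a simplified illustration of that kind of logic, the sketch below derives a time-to-collision from a range and closing-speed estimate and requests braking when it falls under a threshold; the 2-second threshold and the function names are assumptions for illustration, not values from any production system.

```c
#include <stdbool.h>
#include <stdio.h>

/* Illustrative sketch: decide whether to request automatic emergency braking
 * from a range and closing-speed estimate. The 2.0 s threshold is a
 * hypothetical tuning value, not a production calibration. */
static bool should_emergency_brake(double range_m, double closing_speed_mps)
{
    if (closing_speed_mps <= 0.0) {
        return false; /* not closing in on the object */
    }
    double time_to_collision_s = range_m / closing_speed_mps;
    return time_to_collision_s < 2.0;
}

int main(void)
{
    /* Example: 25 m gap, closing at 15 m/s -> TTC ~1.67 s -> brake */
    printf("brake: %d\n", should_emergency_brake(25.0, 15.0));
    return 0;
}
```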

Incorporating “touch”

The next frontier in ADAS evolution will be incorporating the sense of touch into vehicles. This marks a departure from the predominantly visual-centric approach, introducing tactile elements that enhance the driving experience and safety.

For instance, tactile sensors can provide both the driver and the vehicle’s systems with valuable information on tire grip. Tactile sensors can assess the coefficient of friction between the tires and the road, providing crucial data for understanding vehicle-road dynamics. These capabilities allow ADAS to maintain its effectiveness and safety in challenging situations such as poor road conditions, pavement distress, high roughness, and so on.
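To show why the friction coefficient matters, here is a rough sketch using the textbook relation between grip and minimum braking distance, d = v^2 / (2·mu·g); the mu values for dry, wet, and icy surfaces are illustrative assumptions, and real grip estimation is considerably more involved.

```c
#include <stdio.h>

/* Illustrative sketch: how an estimated tire-road friction coefficient (mu)
 * changes the minimum braking distance, using d = v^2 / (2 * mu * g). */
#define GRAVITY_MPS2 9.81

static double braking_distance_m(double speed_mps, double mu)
{
    return (speed_mps * speed_mps) / (2.0 * mu * GRAVITY_MPS2);
}

int main(void)
{
    double speed = 100.0 / 3.6; /* 100 km/h in m/s */
    printf("Dry (mu=0.9): %.1f m\n", braking_distance_m(speed, 0.9)); /* ~43.7 m */
    printf("Wet (mu=0.5): %.1f m\n", braking_distance_m(speed, 0.5)); /* ~78.7 m */
    printf("Icy (mu=0.1): %.1f m\n", braking_distance_m(speed, 0.1)); /* ~393 m */
    return 0;
}
```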

Tire grip sensors actively assess road and weather conditions, and their data is accessible to both the human driver and ADAS. This data enables ADAS to work in almost any condition and informs the driver, who can then make driving adjustments tailored to the current conditions. This approach represents a significant advance over traditional reactive alerts, offering drivers proactive guidance and recommendations to enhance safety and adaptability in a dynamically changing environment.

Sensor fusion

As we look to the future, the integration of all of these various sensors – known as sensor fusion – will be a key development in the evolution of ADAS. Sensor fusion involves combining data from different physical and virtual sensors to create a more complete picture of the vehicle’s surroundings. This process enables the vehicle to have a better understanding of its environment, including weather conditions, road hazards, nearby objects, and speed zones.

Embracing sensor fusion is essential, not only to overcome the individual limitations of each sensor but also to harness their collective strengths. This approach includes both redundancy, by using multiple sensors of the same type, and synergy, in which different types of sensors complement and enhance each other’s capabilities.
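As a minimal sketch of the synergy idea, the example below fuses two range estimates, say from a camera and a radar, by weighting each with the inverse of its variance so the more certain sensor dominates. The numbers are hypothetical, and production ADAS stacks typically rely on Kalman-style filters over many sensors and time steps rather than a single static combination.

```c
#include <stdio.h>

/* Illustrative sketch of one basic fusion idea: combine two independent
 * range estimates weighted by the inverse of their variances. */
static double fuse_inverse_variance(double est_a, double var_a,
                                    double est_b, double var_b)
{
    double w_a = 1.0 / var_a;
    double w_b = 1.0 / var_b;
    return (w_a * est_a + w_b * est_b) / (w_a + w_b);
}

int main(void)
{
    /* Hypothetical: camera says 30.0 m (variance 4.0), radar says 28.5 m
     * (variance 0.25). The fused estimate lands close to the radar value. */
    double fused = fuse_inverse_variance(30.0, 4.0, 28.5, 0.25);
    printf("Fused range: %.2f m\n", fused); /* ~28.59 m */
    return 0;
}
```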

Sensor fusion not only enhances safety but is also crucial for developing fully autonomous driving. It’s the key to transitioning from advanced driver assistance to complete vehicle autonomy. Sensor fusion enables vehicles to independently navigate and respond to complex road situations, potentially surpassing human decision-making capabilities.

Final thoughts

For true vehicle autonomy, it’s essential that cars do more than just see the road; they must also be able to sense it in a manner akin to a human driver. Visual perception is a critical component, but it’s just one part of the complex autonomy puzzle. Vehicle manufacturers need to integrate and fuse various sensory inputs, advancing ADAS from a mere assistant to a fully capable autonomous system.

Shahar Bin-Nun – Tactile Mobility
Shahar Bin-Nun is CEO of Tactile Mobility. He has 21 years of experience in global sales, marketing, and business development. He previously served as CEO of HumanEyes Technologies, a VR company with over 70 patents across various 3D and computer vision fields. Prior to his U.S.-based tenure at HumanEyes, he served as VP of sales and business development for Press-sense Inc., a provider of software solutions to the printing industry, as VP of sales at Magink Display, and as CEO of CTV Tech.

News & Events - Parasoft

Originally published on embedded.com