THE EVOLUTION OF ADAS: ENHANCING SENSORY FEEDBACK FOR SAFER DRIVING

Mar 21, 2024

Today's advanced driver assistance systems are relatively rudimentary applications, but more innovative capabilities are now breaking into the fray

By Yagil Tzur, VP of Products at Tactile Mobility

Advancements in autonomous vehicle technologies are making the road a safer place. In the quest toward complete autonomy, vehicle manufacturers are investing in advanced driver assistance systems (ADAS), which function as a sort of co-pilot for a vehicle. These systems don’t function independently of a human driver but rather help the driver operate the vehicle more safely. That’s good news, considering human error contributes to 94 percent of serious road accidents.

Most drivers are already familiar with basic ADAS features, including automated emergency braking (AEB), automated lane keeping (ALK), pedestrian detection and blind-spot sensors, to name a few. The mechanisms here are pretty straightforward: If you get too close to another vehicle, an ADAS system will brake for you. However, these are relatively rudimentary applications, and more innovative advancements are now breaking into the fray.

One such innovation involves manufacturers giving vehicles more sensory feedback to work with. Today, most ADAS systems rely on perception sensors such as cameras, light detection and ranging (lidar) and radar. However, future iterations of ADAS will increasingly rely on sensors that also provide tactile inputs to understand the relationship between the vehicle and the road. Vehicles might also use multiple types of sensory feedback in a single platform to create a more comprehensive navigation system. This type of sensory data will take us one step closer to fully autonomous driving.

Promising ADAS trends

ADAS comprises a set of systems designed to make the driver’s life easier and the driving experience safer through automated responses. These systems function as a sort of stopgap on the way toward complete vehicle autonomy. Consequently, ADAS systems are rapidly advancing as interest in autonomous vehicles spikes.

To improve ADAS, manufacturers are increasingly installing multiple perception sensors. This move addresses the limitations of relying solely on one type of technology. The goal is to generate a more comprehensive picture of the vehicle’s environment by combining the strengths of each sensor. Specifically, there is a growing emphasis on developing these advanced comprehensive perception systems for tasks such as hazard detection, lane assistance and object recognition. While most ADAS systems can detect approaching objects, a common limitation is the inability to identify the object or accurately gauge its distance.

Moreover, connectivity is emerging as a pivotal feature in the evolving landscape of ADAS. Wireless networks, particularly Vehicle-to-Everything (V2X) connectivity, are gaining prominence. V2X allows real-time transmission of critical data on factors such as road conditions and even weather forecasts. It also enables data generated by one vehicle’s sensors to be shared instantly with other vehicles on the road and with a cloud system, allowing both human drivers and ADAS to adjust driving style based on a litany of real-time factors.
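To make the data-sharing idea concrete, here is a minimal sketch in Python of the kind of road-condition payload a vehicle might broadcast. The RoadConditionReport fields and the JSON encoding are illustrative assumptions; real V2X stacks use standardized message sets such as SAE J2735 rather than ad-hoc JSON.

from dataclasses import dataclass, asdict
import json
import time

@dataclass
class RoadConditionReport:
    # Hypothetical payload a vehicle might share over a V2X link.
    vehicle_id: str
    latitude: float
    longitude: float
    estimated_grip: float   # 0.0 (ice) to 1.0 (dry asphalt), from tactile sensing
    surface: str            # e.g. "dry", "wet", "snow"
    timestamp: float

def encode_report(report: RoadConditionReport) -> bytes:
    # Serialize for transmission; JSON is used here purely for illustration.
    return json.dumps(asdict(report)).encode("utf-8")

payload = encode_report(RoadConditionReport("veh-042", 32.08, 34.78, 0.45, "wet", time.time()))
print(len(payload), "bytes")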

Beyond perception sensors and V2X connectivity, manufacturers are also exploring enhanced sensory capabilities. For one, ADAS struggles to recognize lane markings and keep the vehicle in the appropriate lane. As a result, manufacturers are pursuing ongoing enhancements in image resolution and integration with machine learning to achieve more sophisticated image processing.

Giving more senses

The demand for more sophisticated ADAS functionalities, such as adaptive cruise control that automatically adjusts vehicle speed based on traffic, road conditions, tire health, speed zones and more, is now driving innovation in sensor technologies. While ADAS already relies on a diverse set of sensors, manufacturers and software developers are going a step further. To elevate ADAS to the next level, the focus is shifting from relying solely on perception sensors to incorporating tactile sensors as well.
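As a rough illustration of that kind of logic, the hypothetical Python sketch below simply picks the most conservative of several independently derived speed caps. The function name and inputs are assumptions made for clarity; real adaptive cruise controllers run closed-loop control over headway and acceleration rather than a single min().

def target_speed_kph(posted_limit: float,
                     traffic_safe_speed: float,
                     road_condition_cap: float,
                     tire_health_cap: float) -> float:
    # Illustrative only: each argument is a speed cap (km/h) derived from a
    # different data source; the controller honors the most restrictive one.
    return min(posted_limit, traffic_safe_speed, road_condition_cap, tire_health_cap)

# A 100 km/h zone, traffic flowing at 92, a wet road capping at 80, worn tires at 85:
print(target_speed_kph(100, 92, 80, 85))  # 80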

To make ADAS a truly valuable co-pilot, manufacturers must imbue these systems with senses comparable to those of a human driver, particularly a sense of touch. Next-gen sensors are now helping the vehicle “feel” the road to capture more data and react accordingly. These virtual sensors include, for example, grip and tire-health estimators that capture the relationship between the vehicle and the road by “feeling” the tires and the available friction.
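One simple signal such a virtual sensor can build on is longitudinal wheel slip, the mismatch between how fast a tire rotates and how fast the vehicle actually travels. The sketch below is a minimal, assumed illustration of that calculation, not any vendor's estimator; real grip estimation blends many more signals over time.

def longitudinal_slip(wheel_speed_mps: float, vehicle_speed_mps: float) -> float:
    # Slip ratio: near 0 means the tire rolls freely; large magnitudes suggest
    # wheel spin or lock-up and, with it, reduced usable grip.
    denom = max(abs(vehicle_speed_mps), 0.1)  # avoid division by zero near standstill
    return (wheel_speed_mps - vehicle_speed_mps) / denom

# Braking over a slippery patch: the wheel slows faster than the car does.
print(round(longitudinal_slip(wheel_speed_mps=18.0, vehicle_speed_mps=22.0), 2))  # -0.18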

One particularly promising development here is sensor fusion, a process that combines data from different sensor types, such as cameras, radar, lidar, ultrasonic sensors, GPS, virtual sensors and inertial sensors, to provide a more robust and accurate representation of the vehicle and its surroundings. For example, while cameras are excellent for object recognition and identification, radar can provide information about the distance and speed of objects, and lidar can offer detailed three-dimensional mapping of the surroundings.
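As a hedged sketch of the principle, the Python snippet below fuses several noisy range estimates with inverse-variance weighting, a textbook way of combining independent measurements of the same quantity. The readings and variances are invented for illustration, and production ADAS stacks typically use Kalman-style filters that also track objects over time.

def fuse_range_estimates(measurements):
    # measurements: list of (value_in_metres, variance) pairs from different sensors.
    # Inverse-variance weighting trusts the less noisy sensors more.
    numerator = sum(value / variance for value, variance in measurements)
    denominator = sum(1.0 / variance for _, variance in measurements)
    return numerator / denominator

# Hypothetical distance-to-object readings (metres, variance):
radar = (41.8, 0.25)   # strong on range and relative speed
lidar = (42.3, 0.04)   # precise geometry
camera = (44.0, 4.0)   # rough monocular depth estimate
print(round(fuse_range_estimates([radar, lidar, camera]), 2))  # close to the lidar value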

Fusing perception sensors is of utmost importance for ADAS systems. However, tactile data, encompassing information about the condition of vehicle tires, vehicle weight, tire grip, friction and more, is crucial for understanding factors like braking distance, safe speed and lane keeping. Therefore, enhancing the sensory sources and fusing tactile data with perception data is the only way to elevate ADAS to the next level.
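A short worked example shows why the tactile side matters: the idealized stopping distance d = v² / (2·µ·g) depends directly on the tire-road friction coefficient µ, which is exactly the kind of quantity a grip estimator supplies in real time. The sketch below ignores driver reaction time and load transfer and is meant only to illustrate that sensitivity.

def braking_distance_m(speed_kph: float, friction_coefficient: float) -> float:
    # Idealized stopping distance d = v^2 / (2 * mu * g); mu comes from a
    # real-time grip estimate, g is gravitational acceleration.
    g = 9.81
    v = speed_kph / 3.6  # km/h to m/s
    return v ** 2 / (2 * friction_coefficient * g)

# Dry asphalt (mu ~ 0.8) versus a wet road (mu ~ 0.4) at 100 km/h:
print(round(braking_distance_m(100, 0.8), 1))  # ~49 m
print(round(braking_distance_m(100, 0.4), 1))  # ~98 m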

The idea is to build a complete picture of the vehicle’s environment from multiple data sources and to run software that analyzes all these inputs in real time. This integration enhances the predictive accuracy and capabilities of ADAS, helping the vehicle—and ultimately the driver—react to hazards and mitigate accidents.

Final thoughts

On the road to vehicle autonomy, cars are evolving to emulate the thinking, sensing and tactile capabilities of a human driver. The need for vehicles to possess more enhanced sensory perception is now propelling further ADAS research and development. These advancements will make us all a little safer and provide the best of both worlds: Two sensing, responsive operators in one vehicle at any given time—one human and one not.

 


Yagil Tzur is the VP of Product at Tactile Mobility. He is a highly experienced technology and product leader with over two decades of experience across multiple industries. He was VP of Products at Igentify, a developer of novel genomic solutions, R&D Group Manager and Program Director at Lumenis, Program Director at Applied Spectral Imaging, and Program Manager at Philips Healthcare (PHG). Yagil holds an MBA from Haifa University and a BSc in Computer Science from the Technion – Israel Institute of Technology.


Posted originally on motor.com