The software-only solution uses “sense of touch” technology to detect objects of varying heights, sizes, shapes, and materials on the road – both organic and hard – such as a human body or road debris. The virtual sensor then prevents the vehicle from running over the object, harming human life, or damaging the vehicle. The safety-level virtual sensor will be added on top of the Tactile Processor Platform, which already includes the company’s suite of virtual sensors, such as grip estimation, tire health, surface sensing, vehicle health, and more.
According to NHTSA, hundreds of children are killed and thousands are injured every year in nontraffic crashes in parking lots, driveways, and private roadways. Runover virtual sensors in both autonomous vehicles and vehicles equipped with ADAS can help reduce this death toll by sending signals that alert the vehicle and driver at distinct stages of a runover. The new virtual sensor will enable another critical safety function: allowing vehicles to sense the road, identify the type of material under their tires, and alert at the initial stage of a runover, preventing vehicles from fully running over objects in incidents that could otherwise be fatal.
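The staged alerting described above can be pictured as a simple state machine: tire signals are mapped to a runover stage, and each stage triggers a different intervention. The sketch below is purely illustrative – the stage names, signal inputs, and thresholds are hypothetical placeholders, not the company’s actual sensor interface or calibration.

```python
from enum import Enum


class RunoverStage(Enum):
    """Hypothetical stages of a runover event (illustrative, not the real API)."""
    CLEAR = 0     # nothing unexpected under the tires
    CONTACT = 1   # tire first touches an unexpected object
    PARTIAL = 2   # object is partially under the tire
    FULL = 3      # object has been fully run over


def classify_stage(load_delta: float, contact_area: float) -> RunoverStage:
    """Map tire-load change and contact-area signals to a runover stage.

    Thresholds are made-up placeholders; a real system would use
    calibrated tire and vehicle dynamics models.
    """
    if load_delta < 0.05:
        return RunoverStage.CLEAR
    if contact_area < 0.2:
        return RunoverStage.CONTACT
    if contact_area < 0.6:
        return RunoverStage.PARTIAL
    return RunoverStage.FULL


def alert(stage: RunoverStage) -> str:
    """Translate a stage into the signal sent to the vehicle and driver."""
    return {
        RunoverStage.CLEAR: "none",
        RunoverStage.CONTACT: "warn",   # alert the driver, prime the brakes
        RunoverStage.PARTIAL: "brake",  # intervene before a full runover
        RunoverStage.FULL: "stop",      # halt the vehicle and report
    }[stage]
```

For example, a small load change with limited contact area (`classify_stage(0.3, 0.1)`) would map to the contact stage and issue a warning, while a large contact area would escalate to a full stop. The point of the sketch is the escalation ladder, not the specific signals.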
Three in four Americans are afraid to ride in fully self-driving vehicles. For autonomous vehicles to earn mainstream trust, they must be far safer than human-controlled vehicles. To achieve this, they must respond to vehicle-road dynamics as well as – or better than – human drivers do; they must be able not only to “see” the road that lies ahead, but also to “feel” the friction, roughness, curves, grades, distresses, and objects in the road under their tires. Runover sensors enable vehicles to sense the road and react to obstacles, hazards, and vulnerable objects, significantly mitigating potential damage.