Touchy-feely self-driving cars of the future will feel the road beneath their tires, providing an optimal driving experience with maximum safety.
Human drivers subconsciously respond to the feelings and sensations caused by the bumps, dynamics and changing grip of the road beneath their wheels. By comparison, autonomous cars sense the condition of the road using cameras, radar, lidar and other sensors. But that’s about to change.
Cars have feelings, too
Israeli start-up Tactile Mobility is on a mission to help autonomous cars feel the road beneath their wheels, providing valuable data to ensure a safer, smarter ride.
“Where other companies enable smart and autonomous vehicles to see the road, we enable those vehicles to feel the road,” Tactile Mobility CEO Amit Nisenbaum recently told The Jerusalem Post. “There is always a compromise between safety and user experience in autonomous vehicles. They rely on cameras and lidar sensors, know the vehicle speed, the speed of the vehicle in front, its distance from your vehicle, and calculate the safe distance of the vehicle given relative velocities, but they don’t know the road’s grip level.”
Traditionally, an autonomous car has had to assume a conservatively low grip level on every road, but driving to that worst case does not work so well in terms of providing the ultimate experience for passengers.
Touchy-feely smarter cars
Adding a human-like sense of touch and an ability to feel the road surface better is clearly a necessity in order to ensure self-driving cars are both safe and able to provide the ultimate user experience, whatever the road conditions. “We do it with software only and based on data generated by multiple, non-visual existing sensors,” explains Nisenbaum.
Tactile Mobility applies machine learning to data gathered from numerous vehicles, and the resulting insights are sent back to each car's computer for maximum safety and performance.
The company’s software runs algorithms and artificial intelligence to provide the autonomous car with rich data and insights on all of the physical road factors that impact the driving experience.
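To make the idea concrete, here is a minimal illustrative sketch of how grip might be inferred from non-visual signals a car already has (wheel speed, vehicle speed, drive force). This is an assumption for illustration only, not Tactile Mobility's actual algorithm; the function names and the simple slip/friction heuristic are hypothetical.

```python
# Illustrative sketch (NOT Tactile Mobility's actual method): estimating a
# road-grip proxy from non-visual signals already available on the CAN bus.

def slip_ratio(wheel_speed_mps: float, vehicle_speed_mps: float) -> float:
    """Longitudinal slip: how much faster the tire surface moves than the car."""
    if vehicle_speed_mps < 0.5:  # avoid dividing by near-zero at standstill
        return 0.0
    return (wheel_speed_mps - vehicle_speed_mps) / vehicle_speed_mps

def friction_utilization(drive_force_n: float, normal_load_n: float) -> float:
    """mu_used = F_x / F_z. When slip is high while mu_used is modest,
    the road's available grip is probably low."""
    return drive_force_n / normal_load_n

# Example: 3000 N of drive force on a 4000 N wheel load, with the wheel
# spinning 15% faster than the car is moving.
slip = slip_ratio(23.0, 20.0)          # 0.15
mu = friction_utilization(3000.0, 4000.0)  # 0.75
```

A real system would fuse many such readings over time and across wheels, but the core signal is the same: the relationship between commanded force and resulting slip reveals how much grip the road is offering.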
“The more computerized that vehicles are becoming – and not just fully-autonomous vehicles – the computers need to have an additional sense of tactility as well,” explains Nisenbaum.
Understanding the road beyond the line of sight
NIRA Dynamics is using the scalability of the HERE Open Location Platform to map traction conditions and improve road safety via intelligent traction control. Building algorithms that combine numerous sources of data – including tire pressure, loose wheels, road conditions, precipitation, ambient temperature – helps their engineers map the changing dynamics of road conditions and friction.
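The aggregation step described above can be sketched in a few lines: many vehicles report local friction estimates, and a map service combines them per road segment, falling back to a conservative value where no data exists yet. This is a hypothetical illustration, not NIRA Dynamics' or HERE's actual pipeline; the class, segment IDs, and weighting scheme are assumptions.

```python
# Illustrative sketch (NOT NIRA Dynamics' actual pipeline): aggregating
# per-vehicle friction observations into a per-road-segment friction map.
from collections import defaultdict

class FrictionMap:
    def __init__(self):
        # segment_id -> list of (friction coefficient, weight) observations
        self._obs = defaultdict(list)

    def report(self, segment_id: str, mu: float, weight: float = 1.0) -> None:
        """A vehicle reports an estimated friction coefficient for a segment."""
        self._obs[segment_id].append((mu, weight))

    def friction(self, segment_id: str, default: float = 0.3) -> float:
        """Weighted-average friction for a segment; a conservative default
        is returned for segments no vehicle has reported on yet."""
        obs = self._obs.get(segment_id)
        if not obs:
            return default
        total_weight = sum(w for _, w in obs)
        return sum(mu * w for mu, w in obs) / total_weight

fmap = FrictionMap()
fmap.report("segment-42", 0.8)  # one car reads dry asphalt
fmap.report("segment-42", 0.4)  # a second car finds it slipperier
```

Querying `fmap.friction("segment-42")` blends both reports, which is the "beyond the line of sight" benefit Magnusson describes: a car can know about low grip before its own tires ever touch that stretch of road.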
“Our core business is selling friction data, we are a friction expert and HERE is the location expert,” explains Per Magnusson, Project Manager at NIRA Dynamics. “We can use different types of data with friction data such as traffic to enhance the understanding of the road beyond the line of sight.”
One thing is for sure: with touchy-feely technologies like these, autonomous cars of the future will provide much improved safety for drivers and passengers alike.
Originally posted at: https://360.here.com/autonomous-touch-lets-cars-feel-the-road-ahead