Shahar Bin-Nun's Presentation at SDV Europe 2024

Dec 11, 2024

Revolutionizing Tire Health Monitoring with Tactile Mobility’s Virtual Tire Wear Sensor

We were thrilled to participate in SDV Europe 2024 and showcase our cutting-edge solutions for the automotive industry. In his engaging presentation, Shahar Bin-Nun, CEO of Tactile Mobility, introduced our latest innovation—the Virtual Tire Wear Sensor.

This groundbreaking technology offers 0.8 mm precision for real-time tire health monitoring, revolutionizing the way OEMs and fleet operators approach vehicle safety and maintenance. Shahar emphasized the sensor’s ability to deliver accurate insights without the need for additional hardware, setting a new industry standard for cost efficiency and safety.

The presentation also explored how Tactile Mobility’s AI-driven tactile data solutions empower vehicles to enhance road safety, optimize performance, and adapt seamlessly to the evolving needs of modern mobility.

Missed the event? Watch the full presentation here to discover how Tactile Mobility is driving innovation in vehicle health monitoring and the future of software-defined vehicles.

Summary of the Q&A Session:

  1. Question on Advanced Driver-Assistance Systems (ADAS) and Sensor Integration:
    • Audience Question: A participant described a situation on the German Autobahn where the ADAS system failed to react in time, requiring manual intervention. They asked whether Tactile Mobility’s virtual sensors, such as the tire wear sensor, would interact with other systems (e.g., ESP) to dynamically adjust braking distances.
    • Shahar: The integration depends on the OEMs, who decide how to use the data (e.g., surfacing it to drivers or dynamically adjusting vehicle systems). He emphasized the importance of accuracy and of defining specific KPIs for different sensors (e.g., grip estimation versus tread wear). While tread wear changes slowly and is therefore less time-sensitive, grip estimation can be safety-critical in scenarios such as icy roads. The goal is to convince OEMs to incorporate these insights into customer-facing functions, despite the challenge of reaching the relevant teams directly.
  2. Question on AI and Machine Learning (ML) Implementation:
    • Audience Question: A participant asked whether the AI/ML processes occur in the vehicle or in the cloud, and how Tactile Mobility prevents model divergence between vehicles.
    • Shahar: AI/ML runs in real time via edge computing on the vehicle’s ECU, where a “vehicle DNA” is maintained and updated through a calibration performed each time the engine is started; differences between vehicles are absorbed through these calibrations (see the sketch after this list). While data normalization and learning happen in the cloud for broader insights, denormalizing that data to send back to individual vehicles is more complex and remains a work in progress.
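
The edge-side flow described above, a per-vehicle calibration profile refreshed at every engine start that feeds a real-time estimator, can be pictured with a short sketch. Tactile Mobility has not published its implementation, so the VehicleDNA class, the smoothing factor, and the tread-depth mapping below are purely illustrative assumptions rather than the company’s actual method.

```python
"""Illustrative sketch only: Tactile Mobility has not published its algorithms,
so every class, field, and constant below is a hypothetical stand-in used to
show the general shape of an edge-side "vehicle DNA" calibration loop."""
from dataclasses import dataclass


@dataclass
class VehicleDNA:
    """Hypothetical per-vehicle calibration profile kept on the ECU."""
    signal_bias: float = 0.0    # per-vehicle offset on the raw wear signal
    signal_scale: float = 1.0   # per-vehicle gain on the raw wear signal
    ignition_cycles: int = 0    # number of calibrations folded into the profile


def calibrate_on_ignition(dna: VehicleDNA, baseline_samples: list[float]) -> VehicleDNA:
    """Blend a fresh baseline (taken shortly after engine start) into the stored
    profile with an exponential moving average, so vehicle-to-vehicle differences
    are absorbed gradually rather than overwritten on every drive."""
    if not baseline_samples:
        return dna
    fresh_bias = sum(baseline_samples) / len(baseline_samples)
    alpha = 0.2  # hypothetical smoothing factor
    dna.signal_bias = (1 - alpha) * dna.signal_bias + alpha * fresh_bias
    dna.ignition_cycles += 1
    return dna


def estimate_tread_depth_mm(dna: VehicleDNA, raw_signal: float,
                            new_tire_depth_mm: float = 8.0) -> float:
    """Map a calibrated, normalized wear signal (0 = new, 1 = fully worn) onto an
    approximate remaining tread depth in millimetres."""
    normalized_wear = (raw_signal - dna.signal_bias) * dna.signal_scale
    normalized_wear = max(0.0, min(1.0, normalized_wear))
    return new_tire_depth_mm * (1.0 - normalized_wear)


if __name__ == "__main__":
    dna = VehicleDNA()
    # Each engine start contributes a short baseline window to the profile.
    dna = calibrate_on_ignition(dna, baseline_samples=[0.05, 0.06, 0.04])
    print(f"Estimated tread depth: {estimate_tread_depth_mm(dna, raw_signal=0.30):.1f} mm")
```

In a production ECU this logic would run in embedded code against real chassis and wheel-speed signals, with the cloud handling the fleet-wide normalization Shahar mentioned; the sketch only conveys the calibrate-then-estimate loop.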

Key Takeaways:

  • Tactile Mobility’s sensors offer capabilities that OEMs can customize for various applications, including ADAS.
  • AI/ML processes involve both vehicle-specific edge computing and cloud-based normalization for broader learning.
  • Engagement with industry professionals through workshops highlights Tactile Mobility’s collaborative approach to addressing automotive challenges.