The learning intelligent steering wheel is a steering wheel with optical touch sensors and multi-colour LEDs around the rim. When the driver's hands are on the wheel, the touch sensors generate touch data in real time. This data can be used to determine the position of the driver's hands and how they move, and more specifically to learn and recognise gestures on the steering wheel.
In its simplest form, the system can detect hands-on and hands-off, an important feature in self-driving cars. In a more advanced form, the touch data can be used to learn and recognise gestures.
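As a rough illustration of the simplest case, hands-on/hands-off detection can be thought of as checking whether any sensor segment around the rim reports a touch. The sensor count, threshold value, and function names below are illustrative assumptions, not details of the Imagimob implementation:

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical sketch of hands-on/hands-off detection from optical touch
 * sensor readings around the rim. NUM_SENSORS and TOUCH_THRESHOLD are
 * assumed values for illustration only. */
#define NUM_SENSORS 32
#define TOUCH_THRESHOLD 100 /* raw sensor counts; assumed */

/* Returns true if any rim segment reports a reading above threshold. */
bool hands_on_wheel(const int readings[NUM_SENSORS])
{
    for (size_t i = 0; i < NUM_SENSORS; ++i) {
        if (readings[i] > TOUCH_THRESHOLD) {
            return true;
        }
    }
    return false;
}
```

A production system would add debouncing over time so that a brief brush of the rim does not count as hands-on, but the basic decision is a threshold test like this.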
These gestures can be used to control various functions in the car, replacing today's controls and stalks so that the driver can keep their hands on the steering wheel and their eyes on the road, which increases safety.
The LEDs can be used for communication from the car to the driver.
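One plausible way such car-to-driver signalling could work is mapping a driver-monitoring state to a colour shown on the rim LEDs. The states and colour choices below are assumptions made for illustration, not documented Imagimob behaviour:

```c
#include <stdint.h>

/* Hypothetical sketch: mapping a driver-monitoring state to an RGB colour
 * on the rim LEDs. States and colours are illustrative assumptions. */
typedef struct { uint8_t r, g, b; } rgb_t;

typedef enum {
    STATE_HANDS_ON,          /* driver engaged */
    STATE_HANDS_OFF_WARNING, /* prompt the driver to touch the wheel */
    STATE_TAKEOVER_REQUEST   /* urgent: driver must take control */
} hmi_state_t;

rgb_t led_colour_for_state(hmi_state_t state)
{
    switch (state) {
    case STATE_HANDS_ON:          return (rgb_t){0, 255, 0};   /* green */
    case STATE_HANDS_OFF_WARNING: return (rgb_t){255, 160, 0}; /* amber */
    case STATE_TAKEOVER_REQUEST:  return (rgb_t){255, 0, 0};   /* red */
    }
    return (rgb_t){0, 0, 0}; /* LEDs off for unknown state */
}
```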
The gestures and the LEDs are channels of communication between the driver and the car, commonly called human-machine interaction (HMI). The importance of HMI as a central component of the autonomous driving revolution cannot be overstated.
The learning intelligent steering wheel will continue to learn: it will become more intelligent over time and adapt itself to the driver. The Imagimob Touch AI software runs on the MCU in the steering wheel.