To process sensor data in real time, the machine learning model needs to run locally on a chip close to the sensor itself, a setup usually called "the edge". In this paper we walk through how an AI application can be created, from the initial data collection to the final embedded application. As an example, we demonstrate a joint project between Imagimob and the radar technology company Acconeer.
Imagimob and Acconeer teamed up in 2019. Both companies focus on providing powerful applications on hardware devices where low power consumption is key.
The goal of this project was to create an embedded application that could classify five different hand gestures in real time using radar data. Thanks to its small size, the radar could be placed inside a pair of earphones, where the gestures would act as virtual buttons, controlling functions that are usually assigned to physical buttons. The project's end product was to be a robust live demo at CES 2020 in Las Vegas.
By reading this white paper, you will gain a better understanding of how to prepare, plan and execute projects to develop embedded AI applications. You will also get insight into how Imagimob AI can speed up the development process and improve the quality of the applications.