Date 01/31/22

Quantization of LSTM layers - a Technical White Paper

As announced before Christmas, Imagimob now supports quantization of LSTM layers. What does this mean, and why is it important?

By Johan Malm, Ph.D., Product Owner at Imagimob

The Need for Quantization
LSTM stands for Long Short-Term Memory and is a vital building block in machine learning for time-series data. This type of neural network layer has been used successfully in numerous state-of-the-art models that handle time dependencies, and it can be credited with much of the success in solving difficult problems in language modelling, translation, robotics, and gaming.

Quantization concerns the actual implementation of the neural network on a digital computer: floating-point weights and activations are replaced with low-bit integers. This is especially important for the tiniest MCUs on the market, which lack a floating-point unit (e.g., Arm's Cortex-M0 to M3 cores).
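Concretely, full integer quantization maps each float tensor to 8-bit integers through a scale and a zero-point. The sketch below is a minimal illustration in plain NumPy (function names and the example range are invented for illustration; this is not Imagimob's generated C code) showing the affine scheme and the small round-trip error it introduces:

```python
import numpy as np

def quantize(x, scale, zero_point):
    """Map float values to int8 with an affine (scale + zero-point) scheme."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize(q, scale, zero_point):
    """Recover approximate float values from the int8 representation."""
    return scale * (q.astype(np.float32) - zero_point)

# Choose scale/zero-point so that an observed float range [-1.0, 2.0]
# maps onto the full int8 range [-128, 127].
xmin, xmax = -1.0, 2.0
scale = (xmax - xmin) / 255.0
zero_point = int(round(-128 - xmin / scale))

x = np.array([-1.0, 0.0, 0.5, 2.0], dtype=np.float32)
q = quantize(x, scale, zero_point)       # int8 values sent to the MCU
x_hat = dequantize(q, scale, zero_point) # what the integer math represents
```

The round-trip error per value is bounded by the scale, which is why choosing ranges that are as tight as possible matters so much.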

Heuristic Approach to the Problem
LSTMs in particular are known to be hard to quantize: their ability to remember features over long sequences of inputs means that small quantization errors can be amplified as they accumulate across timesteps. In this article we look at a way to attack this tricky problem. We show how full integer post-training quantization works and how one can go about solving the dependencies between parameters using a heuristic approach.
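One source of the difficulty can be seen in a toy sketch (a float LSTM cell in NumPy; the sizes, weights, and the power-of-two heuristic here are illustrative assumptions, not Imagimob's actual method): the hidden state produced at one timestep is consumed again at the next, so a single set of quantization parameters must cover every timestep, and the range must be calibrated over representative data before a scale can be fixed.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One float LSTM step; the four gates are packed as [i, f, g, o]."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i, f = sigmoid(z[:n]), sigmoid(z[n:2*n])
    g, o = np.tanh(z[2*n:3*n]), sigmoid(z[3*n:])
    c = f * c + i * g          # cell state carries memory across steps
    h = o * np.tanh(c)         # h feeds back in at the next timestep
    return h, c

# Calibration: run the float model over representative data and record the
# range of the recurrent state. Because h is both an output and the next
# step's input, one shared scale must serve both roles.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W = rng.normal(0, 0.3, (4 * n_hid, n_in))
U = rng.normal(0, 0.3, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h_lo, h_hi = 0.0, 0.0
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(100):
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
    h_lo, h_hi = min(h_lo, h.min()), max(h_hi, h.max())

# Example heuristic: one symmetric scale covering the observed range,
# rounded up to a power of two so rescaling becomes a bit shift.
scale = 2.0 ** np.ceil(np.log2(max(abs(h_lo), abs(h_hi)) / 127.0))
```

Rounding the scale up guarantees the observed range is never clipped, at the cost of slightly coarser resolution; resolving such trade-offs across all the interdependent LSTM parameters is where a heuristic search comes in.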

Imagimob’s Solution
In Imagimob AI, quantization of a pretrained model is done with the click of a button in a user-friendly interface, which generates C code free of floating-point operations, ready for deployment.

Please fill in the form below to download the white paper.

