
By Tech Dr Alex Jonsson
How do you sample high-resolution biometric data, send it over a low-power wide-area network (LPWAN) and still achieve high-quality results on trickle-feed battery power alone? The answer is Edge AI (aka tinyML). Here's how Imagimob implemented an unsupervised training model, namely a Gaussian Mixture Model (GMM), in order to do just that!
Introduction
Within the AFarCloud project [1], Imagimob has for the past two years had the privilege of working with some of the most prominent organisations and kick-ass scientists in precision agriculture - Universidad Politécnica de Madrid, the Division of Intelligent Future Technologies at MDH, Charles University in Prague and RISE Research Institutes of Sweden - to create an end-to-end service for real-time monitoring of the health and well-being of cattle.
Indoors, foremost for milk-giving (dairy) cattle, there are commercially available methods that combine multi-sensor camera systems, radio beacons and specialist equipment such as ruminal probes, allowing farmers to keep their livestock under continuous surveillance, mostly indoors. In many European countries, however, much of the beef cattle is kept outdoors, often on large pastures spanning hundreds or thousands of hectares. This makes short-range, high-data-rate networking technologies challenging to use for aggregating sensor data: access to power lines and electrical outlets is scarce, and most equipment is battery-powered only.
With a small form factor worn around the neck, a sealed enclosure crams in a 32-bit microcomputer, a 9-axis movement sensor from Bosch sampling at a full 50 Hz, a long-range radio and a 40-gram Tesla-style battery cell. The magic comes in when all of the analysis is performed on that very same microcomputer, then and there at the edge - literally on-cow AI - rather than the traditional transfer of raw data to a cloud for processing and storage. Every hour, a small data set containing a refined result is sent over the airwaves, letting the farmer see how much of the animal's time has gone into feeding, ruminating, resting, romping around et cetera, in the form of an activity pie chart. By surveilling many animals, an individual bovine can be compared both to its peers and to its own activity over time. Periods of being in heat (fertility) are important for the farmer to monitor, as cows only give milk after becoming pregnant [2]. Cows in heat are, among other things, more restless and alert: they tend to stand while the rest of the herd is lying down resting, fondle one another and bellow frequently.
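To make the "refined result" idea concrete: instead of shipping raw 50 Hz samples, the device only needs to send the fraction of the hour spent on each activity. Imagimob's on-device code isn't public, so this is just a minimal sketch of that aggregation step; the label names and the 2-second window length are assumptions for illustration.

```python
from collections import Counter

# Hypothetical activity classes for the pie chart
LABELS = ["feeding", "ruminating", "resting", "romping"]

def hourly_summary(window_labels):
    """Aggregate per-window activity labels into fractions of the hour.

    window_labels: one predicted label per fixed-length window
    (e.g. 1800 two-second windows covering one hour of 50 Hz IMU data).
    Returns a dict mapping each activity to its share of the hour -
    a few bytes instead of ~540k raw sensor samples.
    """
    counts = Counter(window_labels)
    total = sum(counts.values())
    return {label: counts.get(label, 0) / total for label in LABELS}

# Toy hour: 1800 two-second windows of classifier output
labels = ["resting"] * 900 + ["feeding"] * 600 + ["ruminating"] * 300
summary = hourly_summary(labels)
```

This tiny payload is what fits comfortably in an hourly LPWAN uplink, and it is exactly the data a pie chart needs.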
Training & AI models
We use the Imagimob AI SaaS and desktop software in-house too, where a lot of the heavy lifting is taken care of once you have an H5 file (Hierarchical Data Format, holding multi-dimensional scientific data sets). The tricky part is gathering data from the field. For this purpose, another device is used which contains essentially the same movement sensors, e.g. the Bosch BMA 280, 456 or 490L chip, plus an SD card, a clock chip and a battery allowing weeks of continuous measurements. Some of the time, the cows were videotaped, allowing us to line up the sensor readings with the goings-on in the meadows. Using the Imagimob capture system, activities are then labelled for the training phase.
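The unsupervised GMM approach mentioned above can be sketched in a few lines. Imagimob's actual model and features aren't published, so the following is only an illustration of the general technique: window the accelerometer stream, compute simple per-window statistics, and fit a Gaussian Mixture Model (here via scikit-learn) so that windows with similar motion signatures cluster together without labels. The window length, feature choice and synthetic data are all assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in for 50 Hz 3-axis accelerometer data: two behaviours
# with clearly different motion energy (think resting vs feeding).
fs = 50                     # sample rate, Hz
win = 2 * fs                # 2-second windows
calm = rng.normal(0.0, 0.05, size=(60 * win, 3))   # low-motion segment
active = rng.normal(0.0, 0.5, size=(60 * win, 3))  # high-motion segment
signal = np.vstack([calm, active])

def window_features(x, win):
    """Split a (samples, axes) signal into fixed windows and compute
    per-axis mean and standard deviation as a simple feature vector."""
    n = len(x) // win
    wins = x[: n * win].reshape(n, win, x.shape[1])
    return np.hstack([wins.mean(axis=1), wins.std(axis=1)])

X = window_features(signal, win)

# Unsupervised training: fit a two-component GMM and read off one
# cluster assignment per window - no labels needed at this stage.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
clusters = gmm.predict(X)
```

The payoff of the unsupervised step is that the model discovers the behaviour groupings on its own; the videotaped, hand-labelled sessions are then only needed to name the clusters, not to train from scratch.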