Machine learning possible on microcontrollers

ARM’s Zach Shelby introduced the use of microcontrollers for machine learning and artificial intelligence at the ECF19 event in Helsinki last Friday. The talk showed that artificial intelligence and machine learning can be applied to small embedded devices in addition to the cloud-based model. In particular, artificial intelligence is well suited to Internet of Things devices. Using machine learning in IoT also makes sense from an energy-efficiency point of view when it lets a device avoid unnecessary, power-hungry communication (for example, by performing local keyword detection before sending voice data to the cloud for more detailed analysis).

According to Shelby, we are now moving to a third wave of IoT that comes with comprehensive device security and voice control. In this model, machine learning techniques are one new capability that can be added on top of the previous work done on IoT.

To use machine learning successfully on small embedded devices, the problem to be solved must involve a reasonably small amount of incoming data and a very limited number of possible outcomes. An ARM Cortex-M4 processor equipped with a DSP unit is powerful enough for simple handwriting decoding or for detecting a few spoken words with a machine learning model. In the examples shown, the machine learning models needed less than 100 kilobytes of memory.
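
To put that footprint in perspective, the skeleton below shows the kind of TensorFlow Lite for Microcontrollers inference loop such a keyword-spotting or handwriting demo is built around. It is a minimal sketch rather than code from the talk: the quantized model array (g_model) and the feature extraction are assumed to exist elsewhere in the project.

    // Minimal TensorFlow Lite for Microcontrollers inference skeleton for a
    // Cortex-M class MCU. g_model is assumed to be a quantized .tflite model
    // converted to a C array; feature extraction is not shown here.
    #include <cstdint>
    #include "tensorflow/lite/micro/all_ops_resolver.h"
    #include "tensorflow/lite/micro/micro_interpreter.h"
    #include "tensorflow/lite/schema/schema_generated.h"

    extern const unsigned char g_model[];      // placeholder model data
    constexpr int kArenaSize = 64 * 1024;      // scratch memory for tensors
    static uint8_t tensor_arena[kArenaSize];
    static tflite::MicroInterpreter* interpreter = nullptr;

    void ml_setup() {
      const tflite::Model* model = tflite::GetModel(g_model);
      static tflite::AllOpsResolver resolver;  // registers every built-in op
      static tflite::MicroInterpreter static_interpreter(
          model, resolver, tensor_arena, kArenaSize);
      interpreter = &static_interpreter;
      interpreter->AllocateTensors();
    }

    // Feed one window of int8 features (e.g. an MFCC spectrogram) and return
    // the index of the most likely class ("yes", "no", silence, ...).
    int ml_classify(const int8_t* features, int feature_count) {
      TfLiteTensor* input = interpreter->input(0);
      for (int i = 0; i < feature_count; ++i) input->data.int8[i] = features[i];

      if (interpreter->Invoke() != kTfLiteOk) return -1;

      TfLiteTensor* output = interpreter->output(0);
      int best = 0;
      for (int i = 1; i < output->dims->data[1]; ++i) {
        if (output->data.int8[i] > output->data.int8[best]) best = i;
      }
      return best;
    }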

The presentation can now be viewed on YouTube:

Important tools and projects mentioned in the presentation:

TinyML

TensorFlow Lite

uTensor (ARM MicroTensor)

TensorFlow Lite Micro

Articles about the presentation:

https://www.uusiteknologia.fi/2019/05/20/ecf19-koneoppiminen-mahtuu-mikro-ohjaimeen/

http://www.etn.fi/index.php/72-ecf/9495-koneoppiminen-mullistaa-sulautetun-tekniikan

 

420 Comments

  1. Tomi Engdahl says:

    During a recent tinyML Foundation Talk, Manivannan Sivan explored the potential of embedded ML in Industry 4.0. His approach to predictive maintenance employs the Portenta H7 and Edge Impulse models to detect anomalous operation in industrial equipment like pumps, valves, and fans.
    https://www.youtube.com/watch?v=4K3D5Ano8VA

  2. Tomi Engdahl says:

    EdenOff is a Nano 33 BLE Sense-based device that can be placed inside of a wall outlet to predict power outages using a tinyML model.

    This Arduino device can anticipate power outages with tinyML
    https://blog.arduino.cc/2022/05/24/this-arduino-device-can-anticipate-power-outages-with-tinyml/

    Its creator, Roni Bandini, then deployed this model to a DIY setup by first connecting a Nano 33 BLE Sense with its onboard temperature sensor to an external ZMPT101B voltage sensor. Users can view the device in operation with its seven-segment display and hear the buzzer if a failure is detected. Lastly, the entire package is portable thanks to its LiPo battery and micro-USB charging circuitry.

    https://studio.edgeimpulse.com/public/90995/latest
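
    As a rough sketch of how such a detection loop is typically wired together (this is not Bandini’s actual code: the inferencing header name, pin numbers, threshold, and the “failure” label are assumptions), an Edge Impulse Arduino export is usually driven like this:

        // Hypothetical sketch: sample the ZMPT101B on A0, run the exported
        // Edge Impulse classifier, and drive a buzzer when a failure is
        // predicted. Header name, pins, and label text are assumptions.
        #include <edenoff_inferencing.h>   // library exported from Edge Impulse Studio

        const int kVoltagePin = A0;
        const int kBuzzerPin  = 2;
        static float samples[EI_CLASSIFIER_RAW_SAMPLE_COUNT];

        void setup() {
          pinMode(kBuzzerPin, OUTPUT);
        }

        void loop() {
          // Fill one window of raw samples at the rate the model was trained on.
          for (size_t i = 0; i < EI_CLASSIFIER_RAW_SAMPLE_COUNT; ++i) {
            samples[i] = analogRead(kVoltagePin);
            delayMicroseconds(1000000 / EI_CLASSIFIER_FREQUENCY);
          }

          signal_t signal;
          numpy::signal_from_buffer(samples, EI_CLASSIFIER_RAW_SAMPLE_COUNT, &signal);

          ei_impulse_result_t result;
          if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

          // Sound the buzzer if the "failure" class wins with high confidence.
          for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; ++i) {
            if (strcmp(result.classification[i].label, "failure") == 0) {
              digitalWrite(kBuzzerPin,
                           result.classification[i].value > 0.8f ? HIGH : LOW);
            }
          }
        }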

  3. Tomi Engdahl says:

    Run Machine Learning Code in an Embedded IoT Node to Easily Identify Objects
    https://www.digikey.com/en/articles/run-machine-learning-code-in-an-embedded-iot-node?dclid=CLy-84zUhvgCFZdNwgodJbAMmw

    Internet of Things (IoT) networks operating in dynamic environments are being expanded beyond object detection to include visual object identification in applications such as security, environmental monitoring, safety, and Industrial IoT (IIoT). As object identification is adaptive and involves using machine learning (ML) models, it is a complex field that can be difficult to learn from scratch and implement efficiently.

    The difficulty stems from the fact that an ML model is only as good as its data set, and once the correct data is acquired, the system must be properly trained to act upon it in order to be practical.

    This article will show developers how to implement Google’s TensorFlow Lite for Microcontrollers ML model into a Microchip Technology microcontroller. It will then explain how to use the image classification and object detection learning data sets with TensorFlow Lite to easily identify objects with a minimum of custom coding.

    It will then introduce a TensorFlow Lite ML starter kit from Adafruit Industries that can familiarize developers with the basics of ML.
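
    One practical detail worth knowing before attempting such a port: TensorFlow Lite for Microcontrollers lets you register only the operators your model actually uses, which keeps the firmware much smaller on flash-limited parts than pulling in every built-in op. The sketch below is illustrative only; the op list has to match the trained model.

        // Register just the operators a small quantized image classifier needs,
        // instead of the full AllOpsResolver. The five ops here are an example.
        #include "tensorflow/lite/micro/micro_mutable_op_resolver.h"

        static tflite::MicroMutableOpResolver<5> resolver;

        void register_ops() {
          resolver.AddConv2D();
          resolver.AddMaxPool2D();
          resolver.AddFullyConnected();
          resolver.AddReshape();
          resolver.AddSoftmax();
          // The resolver is then handed to the MicroInterpreter as usual.
        }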

  4. Tomi Engdahl says:

    Predictive Maintenance Of Compressor Water Pumps
    https://hackaday.io/project/185930-predictive-maintenance-of-compressor-water-pumps

    Applying sensor fusion with RSL10 and Bosch sensors to run a TinyML model for predictive maintenance of compressor water pumps.

  5. Tomi Engdahl says:

    Tic-Tac-Toe Game with TinyML-based Digit Recognition
    https://hackaday.io/project/185957-tic-tac-toe-game-with-tinyml-based-digit-recogniti

    Play Tic-Tac-Toe (also known as Xs and Os) using handwritten digits recognized with the help of TinyML techniques.

  6. Tomi Engdahl says:

    Edging Ahead When Learning On The Edge
    https://hackaday.com/2022/06/21/edging-ahead-when-learning-on-the-edge/

    “With the power of edge AI in the palm of your hand, your business will be unstoppable.”

    That’s what the marketing seems to read like for artificial intelligence companies. Everyone seems to have cloud-scale AI-powered business intelligence analytics at the edge. While sounding impressive, we’re not convinced that marketing mumbo jumbo means anything. But what does AI on edge devices look like these days?

  7. Tomi Engdahl says:

    Detecting harmful gases with a single sensor and tinyML
    https://blog.arduino.cc/2022/07/11/detecting-harmful-gases-with-a-single-sensor-and-tinyml/

    Experiencing a chemical and/or gas leak can be potentially life-threatening to both people and the surrounding environment, which is why detecting them as quickly as possible is vital. But instead of relying on simple thresholds, Roni Bandini was able to come up with a system that can spot custom leaks by recognizing subtle changes in gas level values through machine learning.

  8. Tomi Engdahl says:

    SparkFun Launches Arducam Pico4ML-Powered Machine Learning and AI Concept Kit
    New kit is designed to introduce core concepts while building practical projects using TensorFlow Lite and Edge Impulse Studio.
    https://www.hackster.io/news/sparkfun-launches-arducam-pico4ml-powered-machine-learning-and-ai-concept-kit-6c52884fab7d

  9. Tomi Engdahl says:

    Run Machine Learning Code in an Embedded IoT Node to Easily Identify Objects
    https://www.digikey.com/en/articles/run-machine-learning-code-in-an-embedded-iot-node?dclid=CLyEmcbQwPkCFQbTGQodgJwPDw

    To run TensorFlow Lite for Microcontrollers, Microchip Technology is targeting machine learning in microcontrollers with the Arm® Cortex®-M4F-based ATSAMD51J19A-AFT microcontroller (Figure 1). It has 512 Kbytes of flash memory with 192 Kbytes of SRAM memory and runs at 120 megahertz (MHz). The ATSAMD51J19A-AFT is part of the Microchip Technology ATSAMD51 ML microcontroller family. It is compliant with automotive AEC-Q100 Grade 1 quality standards and operates over -40°C to +125°C, making it applicable for the harshest IoT and IIoT environments. It is a low-voltage microcontroller and operates from 1.71 to 3.63 volts when running at 120 MHz.

  10. Tomi Engdahl says:

    Seeed Studio Launches Tiny Edge AI Grove Vision AI Module, Promises Edge Impulse Support to Come
    Designed around an Omnivision camera sensor and a Himax low-power tinyML processor, this compact board runs machine learning on-device.
    https://www.hackster.io/news/seeed-studio-launches-tiny-edge-ai-grove-vision-ai-module-promises-edge-impulse-support-to-come-17115f3bddf0

  11. Tomi Engdahl says:

    This proof of concept pen can recognize letters written in thin air using an Edge Impulse tinyML model on the Nano RP2040 Connect.

    Air-writing TinyML Alphabet Recognition
    A proof-of-concept project to recognize air-written alphabet, using Arduino Nano RP2040 Connect run with Edge Impulse tinyML model.
    https://www.hackster.io/JuanYi/air-writing-tinyml-alphabet-recognition-12640b

  12. Tomi Engdahl says:

    “But combining tinyML and Arduino hardware can potentially provide very low-cost embedded AI solutions.”

    New Uses For AI In Chips
    https://semiengineering.com/new-uses-for-ai-in-chips/

    ML/DL is increasing design complexity at the edge, but it’s also adding new options for improving power and performance

  13. Tomi Engdahl says:

    TinyMaix is a lightweight machine learning library for microcontrollers
    https://www.cnx-software.com/2022/08/17/tinymaix-lightweight-machine-learning-library-for-microcontrollers/

    Sipeed’s TinyMaix open-source machine learning library is designed for microcontrollers, and is lightweight enough to run on the Microchip ATmega328 MCU found in the Arduino UNO board and its many clones.

    Developed during a weekend hackathon, the core code of TinyMaix is about 400 lines long, with a binary size of about 3KB and low RAM usage, enabling it to run MNIST handwritten digit classification on an ATmega328 MCU with just 2KB SRAM and 32KB flash.
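
    To see why a digit classifier can fit in that kind of budget: with 8-bit weights, a small fully connected layer is just a loop of integer multiply-accumulates over byte arrays, with the weights kept in flash and only the activations held in SRAM. The fragment below is a generic illustration of that arithmetic, not TinyMaix’s actual API.

        // Generic fixed-point dense layer of the kind tiny MNIST models use.
        // Weights and biases live in flash; a 2 KB RAM part only ever holds
        // the current input and output activation vectors.
        #include <stdint.h>

        // out[j] = saturate((bias[j] + sum_i in[i] * w[j*in_len + i]) >> shift)
        void dense_q8(const int8_t* in, int in_len,
                      const int8_t* w, const int32_t* bias,
                      int8_t* out, int out_len, int shift) {
          for (int j = 0; j < out_len; ++j) {
            int32_t acc = bias[j];
            for (int i = 0; i < in_len; ++i) {
              acc += (int32_t)in[i] * (int32_t)w[j * in_len + i];
            }
            acc >>= shift;                 // rescale back into 8-bit range
            if (acc > 127) acc = 127;      // saturate
            if (acc < -128) acc = -128;
            out[j] = (int8_t)acc;
          }
        }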

  14. Tomi Engdahl says:

    Voice controlled AI Mood Lamp
    Give your microcontroller the ability to hear you and set the right mood lighting.
    https://www.hackster.io/mcmchris/voice-controlled-ai-mood-lamp-5917ca

  15. Tomi Engdahl says:

    The Infineon Technologies AG team added the power of ML to their TLE493D-W2B6 3D magnetic sensor-based joystick to bring your gaming experience to the next level.
    https://www.hackster.io/Infineon_Team/ai-powered-joystick-907a1f

  16. Tomi Engdahl says:

    Using CircuitPython, this compact machine learning project takes camera input and looks for digits to classify.

    Ashish Patil’s TinyML Raspberry Pi Pico Project Handles Handwritten Digit Recognition On-Device
    https://www.hackster.io/news/ashish-patil-s-tinyml-raspberry-pi-pico-project-handles-handwritten-digit-recognition-on-device-d22373eee680

    Software developer Ashish Patil has put together a step-by-step guide to computer vision work on the Raspberry Pi Pico, processing the image from a camera sensor to recognize handwritten digits.

    “[This is] a project using Raspberry Pi Pico, an [Omnivision] OV7670 camera module, a 120×160 TFT LCD display, and machine learning,” Patil explains of his creation, “to build a portable handwritten digit classification system. [It] analyzes photos received from a camera and tries to infer what digit was present in the image.”
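
    The guide itself is written in CircuitPython, but the preprocessing step it describes, shrinking the camera frame down to the model’s small grayscale input, boils down to something like the C++ sketch below (the frame and input dimensions are assumptions).

        // Shrink an RGB565 camera frame to a 28x28 grayscale model input by
        // nearest-neighbour sampling. Frame dimensions are assumptions.
        #include <stdint.h>

        #define FRAME_W 160
        #define FRAME_H 120
        #define MODEL_W 28
        #define MODEL_H 28

        void frame_to_input(const uint16_t* frame, float* model_input) {
          for (int y = 0; y < MODEL_H; ++y) {
            for (int x = 0; x < MODEL_W; ++x) {
              int sx = x * FRAME_W / MODEL_W;
              int sy = y * FRAME_H / MODEL_H;
              uint16_t p = frame[sy * FRAME_W + sx];
              // Unpack RGB565 and convert to a 0..1 grayscale value.
              uint8_t r = (p >> 11) & 0x1F, g = (p >> 5) & 0x3F, b = p & 0x1F;
              model_input[y * MODEL_W + x] =
                  (r * 8 * 0.299f + g * 4 * 0.587f + b * 8 * 0.114f) / 255.0f;
            }
          }
        }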

  17. Tomi Engdahl says:

    Thanks to tinyML on the Nano 33 BLE Sense, Tauno Erik’s artwork can recognize and respond whenever someone takes a picture of it with their phone.

    This piece of art knows when it’s being photographed thanks to tinyML
    https://blog.arduino.cc/2022/09/09/this-piece-of-art-knows-when-its-being-photographed-thanks-to-tinyml/

    Nearly all art functions in just a single direction by allowing the viewer to admire its beauty, creativity, and construction. But Estonian artist Tauno Erik has done something a bit different thanks to embedded hardware and the power of tinyML. His work is able to actively respond to a person whenever they bring up a cell phone to take a picture of it.

    At the center are four primary circuits/components, which include a large speaker, an abstract LED sculpture, an old Soviet-style doorbell board, and a PCB housing the control electronics. The circuit contains an Arduino Nano 33 BLE Sense along with an OV7670 camera module.

    Tauno then trained a machine learning model with the help of Edge Impulse on almost 700 images that were labeled as human-containing, cell phone, or everything else/indeterminate.

    With the model trained and deployed to the Nano 33 BLE Sense, a program was written that grabs a frame from the camera, converts its color space to 24-bit RGB, and sends it to the model for inferencing. The resulting label can then be used to activate the connected doorbell and play various animations on the LED sculpture.

    https://taunoerik.art/2022/09/04/artwork-that-knows-when-you-are-taking-a-picture-of-it/

  18. Tomi Engdahl says:

    DeepPicarMicro Crams NVIDIA’s PilotNet Autonomous Vehicle Neural Network Into a Raspberry Pi Pico
    Clever optimization approaches take a CNN model designed for high-end GPUs and run it on the low-cost RP2040.
    https://www.hackster.io/news/deeppicarmicro-crams-nvidia-s-pilotnet-autonomous-vehicle-neural-network-into-a-raspberry-pi-pico-2a0b8a38e18e

    A trio of scientists from the University of Kansas has published a paper on DeepPicarMicro, an autonomous vehicle testbed that crams a fully-functional convolutional neural network (CNN) onto a Raspberry Pi Pico microcontroller board.

    “Running deep neural networks (DNNs) on tiny Microcontroller Units (MCUs) is challenging due to their limitations in computing, memory, and storage capacity,” the team admits. “Fortunately, recent advances in both MCU hardware and machine learning software frameworks make it possible to run fairly complex neural networks on modern MCUs, resulting in a new field of study widely known as tinyML. However, there have been few studies to show the potential for tinyML applications in cyber physical systems (CPS).”

  19. Tomi Engdahl says:

    Based on a Nano 33 BLE Sense, this small tracker uses machine learning to monitor packages in transit for improper handling.

    tinyML device monitors packages for damage while in transit
    https://blog.arduino.cc/2022/09/10/this-tinyml-device-monitors-packages-for-damage-while-in-transit/

  20. Tomi Engdahl says:

    Designed to improve safety in the workplace, this machine learning system is capable of detecting and reporting falls so that help can be dispatched without delay.

    Early detection of workers falls with Machine Learning
    https://www.hackster.io/roni-bandini/early-detection-of-workers-falls-with-machine-learning-c3502c

    Fall detection prototype with Arduino clients, Raspberry Pi server and Machine Learning.

  22. Tomi Engdahl says:

    Making Perfect Toast with an AI Nose
    Shawn Hymel took advantage of machine learning to build a toaster that always makes perfect toast.
    https://www.hackster.io/news/making-perfect-toast-with-an-ai-nose-78e1ccc2626e

  23. Tomi Engdahl says:

    IoT AI-driven Tree Disease Identifier w/ MMS
    https://hackaday.io/project/187720-iot-ai-driven-tree-disease-identifier-w-mms

    Detect tree diseases and get informed of the results via MMS to prevent them from spreading and harming forests, farms, and arable lands.

  24. Tomi Engdahl says:

    POLYN’s Upcoming NASP Neuromorphic TinyML Chips Get Voice Extraction Capabilities with NeuroVoice
    Drawing just 100µW of power and small enough for in-ear earbud use, this tinyML chip family can pull clear speech from the noisiest feeds.
    https://www.hackster.io/news/polyn-s-upcoming-nasp-neuromorphic-tinyml-chips-get-voice-extraction-capabilities-with-neurovoice-651e76456218

  25. Tomi Engdahl says:

    Smart Bee Design’s Bee Motion S3 Is a Neat, All-in-One IoT Motion Sensor with STEMMA QT Expansion
    Designed for long-term operation on battery power, this compact board packs a tinyML-capable dual-core processor and PIR sensor.
    https://www.hackster.io/news/smart-bee-design-s-bee-motion-s3-is-a-neat-all-in-one-iot-motion-sensor-with-stemma-qt-expansion-4746dbad31ec

  26. Tomi Engdahl says:

    MIT boffins cram ML training into microcontroller memory
    Neat algorithmic trick squeezing into 256KB of RAM, barely enough for inference let alone teaching
    https://www.theregister.com/2022/10/05/microcontroller_ml_training/

    Researchers claim to have developed techniques to enable the training of a machine learning model using less than a quarter of a megabyte of memory, making it suitable for operation in microcontrollers and other edge hardware with limited resources.

    The researchers at MIT and the MIT-IBM Watson AI Lab say they have found “algorithmic solutions” that make the training process more efficient and less memory-intensive.

  27. Tomi Engdahl says:

    Power outages anticipation device ready for Maker Faire Rome © MIT
    EdenOff, a TinyML Arduino device, was modified for secure and interactive demos at Maker Faire Rome 2022
    https://create.arduino.cc/projecthub/roni-bandini/power-outages-anticipation-device-ready-for-maker-faire-rome-277e05

  28. Tomi Engdahl says:

    Seeed Studio’s SenseCAP A1101 Aims at a 10-Minute Start for Industrial-Grade TinyML Computer Vision
    https://www.hackster.io/news/seeed-studio-s-sensecap-a1101-aims-at-a-10-minute-start-for-industrial-grade-tinyml-computer-vision-e1befa89e7d7

    Part of the company’s industrial-grade SenseCAP family, the A1101 aims for ultra-low-power on-device computer vision with LoRaWAN support.

  29. Tomi Engdahl says:

    This bike suspension system uses tinyML to automatically adjust itself in real-time depending on the terrain.

    Smart Bike Suspension © GPL3+
    https://create.arduino.cc/projecthub/jallsonsuryo/smart-bike-suspension-7da023

    An automatic suspension adjustment for a bicycle that is able to understand the character of the terrain and the activities of the rider.

  30. Tomi Engdahl says:

    TinyRL is a framework for developing reinforcement learning algorithms capable of running on resource-constrained hardware platforms.

    Teaching the Littlest Learners
    https://www.hackster.io/news/teaching-the-littlest-learners-84722d261b1d

    TinyML is getting a lot of attention lately because it allows for the development of very small, low-power machine learning models that can be deployed on resource-constrained devices. By eliminating the large computational and energy requirements of traditional machine learning models that run in the cloud, costs can be slashed dramatically, and issues related to data privacy and latency can also be sidestepped. This opens up a wide range of intelligent algorithms to be used in countless additional use cases.

    The software component of this problem is being attacked from many fronts: TensorFlow Lite for Microcontrollers and the Edge Impulse machine learning development platform, for example, produce highly optimized models that can run on very small hardware platforms.

    A team centered at Newcastle University has recently thrown their hat in the ring with a technique designed to make one particular type of algorithm easier to deploy to resource-constrained devices: reinforcement learning. Reinforcement learning is in many ways especially difficult to bring to tiny hardware platforms because not only does the algorithm need to run inferences, but it also needs to continually learn and adapt itself to new situations by collecting data over time. The framework, called TinyRL, was demonstrated using several common, low-power microcontroller development boards.

    In a nutshell, TinyRL works by first running the initial algorithm training process on a computing platform with sufficient resources to run high-level programming environments to streamline the development process. In most cases, a Raspberry Pi single-board computer would suffice for this purpose. Keeping in mind that the ultimate destination for these algorithms is a microcontroller, the team chose to focus on using Q-Learning.

    TinyRL was evaluated with the help of a virtual environment called the OpenAI Gym. The Cart Pole and Mountain Car virtual environments from the classic control group were selected. The learning process was carried out on a computer with an Intel i7 processor and 16GB of RAM, then the Q-arrays were transferred to a number of different platforms with the help of the Arduino IDE. The NUCLEO-L031K6, NUCLEO-L432KC, Arduino Nano 33 BLE Sense, Raspberry Pi Pico, and ESP32-CAM-MB boards were all evaluated.

    The slowest algorithm run time on any platform was 261 microseconds, and the best result clocked in at 2 microseconds.

    The researchers were able to prove that TinyRL is an effective method for implementing reinforcement learning algorithms on resource-constrained platforms.
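
    The deployment side is easy to picture: training leaves behind nothing but a table of Q-values, so the microcontroller only has to look up the best action for its current state, which is why the measured run times land in the microsecond range. The sketch below is a generic illustration of that lookup (the state and action counts are placeholders, and the table itself comes from the off-board training step), not the TinyRL code itself.

        // Q-table inference of the kind TinyRL deploys: an offline trainer
        // (e.g. on a Raspberry Pi) fills q_table[] using the usual update
        //   Q[s][a] += alpha * (r + gamma * max_a' Q[s'][a'] - Q[s][a])
        // and the MCU just picks the action with the highest Q-value.
        #define N_STATES  64   // placeholder discretized state count
        #define N_ACTIONS 3    // e.g. push left, do nothing, push right

        // Filled offline and flashed with the firmware.
        extern const float q_table[N_STATES][N_ACTIONS];

        int best_action(int state) {
          int best = 0;
          for (int a = 1; a < N_ACTIONS; ++a) {
            if (q_table[state][a] > q_table[state][best]) best = a;
          }
          return best;
        }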

  31. Tomi Engdahl says:

    The MinUn TinyML Framework Squeezes Machine Learning Models Onto Resource-Light Microcontrollers
    The first “holistic” machine learning framework, MinUn aims to beat TensorFlow Lite and others on size and accuracy.
    https://www.hackster.io/news/the-minun-tinyml-framework-squeezes-machine-learning-models-onto-resource-light-microcontrollers-07d581406e4c

