Machine learning possible on microcontrollers

ARM’s Zach Shelby introduced the use of microcontrollers for machine learning and artificial intelligence at the ECF19 event in Helsinki last Friday. The talk showed that artificial intelligence and machine learning can be applied to small embedded devices in addition to the cloud-based model. In particular, artificial intelligence is well suited to Internet of Things devices. Using machine learning in IoT also makes sense from an energy-efficiency point of view when it allows unnecessary power-consuming communication to be avoided (for example, local keyword detection before sending voice data to the cloud for more detailed analysis).

According to Shelby, we are now moving to a third wave of IoT that comes with comprehensive device security and voice control. In this model, machine learning techniques are one new application that can be added on top of previous work done on IoT.

To use machine learning successfully in a small embedded device, the problem to be solved needs to have reasonably little incoming information and a very limited number of possible outcomes. An ARM Cortex-M4 processor equipped with a DSP unit is powerful enough for simple handwriting decoding or for detecting a few spoken words with a machine learning model. In the examples shown, the machine learning models needed less than 100 kilobytes of memory.
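
To make this concrete, here is a minimal sketch of what such a deployment looks like with TensorFlow Lite for Microcontrollers. The model array (g_model), the three registered ops and the 10 KB tensor arena are placeholders for whatever model you actually train, and the MicroInterpreter constructor has changed slightly between TFLM releases, so treat this as an outline rather than drop-in code:

```cpp
// Outline of running a small model with TensorFlow Lite for Microcontrollers.
// g_model[] is a hypothetical flatbuffer exported from your own training pipeline
// (e.g. converted with xxd -i); the arena size is a placeholder for a tiny model.
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model[];

namespace {
constexpr int kArenaSize = 10 * 1024;   // well under the ~100 KB budget mentioned above
alignas(16) uint8_t tensor_arena[kArenaSize];
tflite::MicroInterpreter* interpreter = nullptr;
}  // namespace

void setup() {
  const tflite::Model* model = tflite::GetModel(g_model);

  // Register only the ops the model actually uses to keep flash usage down.
  static tflite::MicroMutableOpResolver<3> resolver;
  resolver.AddFullyConnected();
  resolver.AddRelu();
  resolver.AddSoftmax();

  // Note: older TFLM releases also took an ErrorReporter argument here.
  static tflite::MicroInterpreter static_interpreter(model, resolver, tensor_arena,
                                                     kArenaSize);
  interpreter = &static_interpreter;
  interpreter->AllocateTensors();
}

void loop() {
  TfLiteTensor* input = interpreter->input(0);
  // Fill input->data.f[] with feature values (audio features, sensor readings, ...).

  interpreter->Invoke();

  TfLiteTensor* output = interpreter->output(0);
  // output->data.f[] now holds one score per class; act on the highest one.
}
```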


The presentation can now be viewed on YouTube:

Important tools and projects mentioned in the presentation:

TinyML

TensorFlow Lite

uTensor (ARM MicroTensor)

TensorFlow Lite Micro

Articles about the presentation:

https://www.uusiteknologia.fi/2019/05/20/ecf19-koneoppiminen-mahtuu-mikro-ohjaimeen/

http://www.etn.fi/index.php/72-ecf/9495-koneoppiminen-mullistaa-sulautetun-tekniikan

 

420 Comments

  1. Tomi Engdahl says:

    Developer Ethan Dell has released a video showcasing how to get started using TensorFlow Lite to run a pose estimation model on a Raspberry Pi 4, building on earlier work by Evan Juraes.

    Ethan Dell Demonstrates TensorFlow Lite Pose Detection on a Raspberry Pi 4 with GPIO Trigger Board
    https://www.hackster.io/news/ethan-dell-demonstrates-tensorflow-lite-pose-detection-on-a-raspberry-pi-4-with-gpio-trigger-board-5ce69fd1404f

    Building on earlier work by Evan Juraes, Dell shows how easy it is to get started with TensorFlow Lite computer vision on the Raspberry Pi.

  2. Tomi Engdahl says:

    In yesterday’s tinyML Talks webcast, Arm engineers Felix Johnny and Fredrik Knutsson explored the performance optimization benefits of using CMSIS-NN with TensorFlow Lite on a Nano 33 BLE Sense.

    https://youtu.be/IggaCzJYcOA

  3. Tomi Engdahl says:

    Accelerated inference on Arm microcontrollers with TensorFlow Lite for Microcontrollers and CMSIS-NN
    https://blog.tensorflow.org/2021/02/accelerated-inference-on-arm-microcontrollers-with-tensorflow-lite.html

    Microcontrollers (MCUs) are the tiny computers that power our technological environment. There are over 30 billion of them manufactured every year, embedded in everything from household appliances to fitness trackers. If you’re in a house right now, there are dozens of microcontrollers all around you. If you drive a car, there are dozens riding with you on every drive. Using TensorFlow Lite for Microcontrollers (TFLM), developers can deploy TensorFlow models to many of these devices, enabling entirely new forms of on-device intelligence.

  4. Tomi Engdahl says:

    This project runs a TensorFlow Lite micro speech model on a Nano 33 BLE to detect specific commands and turn on different colored LEDs to emulate traffic lights.

    https://create.arduino.cc/projecthub/datasciencelearningnerd/controlling-traffic-lights-with-micro-speech-095258
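
    A rough sketch of the command-handling side is below. The stock micro_speech example funnels recognition results through a command-responder hook; the "go"/"stop" keywords, the pin numbers and the simplified hook signature here are assumptions made for the traffic-light idea rather than the project's actual code:

    ```cpp
    // Simplified command-responder hook in the style of the micro_speech example.
    // The "go"/"stop" keywords and pin choices are assumptions for a traffic-light
    // demo; the stock example ships with "yes"/"no" and a slightly different signature.
    #include <Arduino.h>

    const int kGreenLedPin = 2;
    const int kRedLedPin   = 3;

    void setupLeds() {
      pinMode(kGreenLedPin, OUTPUT);
      pinMode(kRedLedPin, OUTPUT);
    }

    // Called by the recognizer whenever a keyword crosses its score threshold.
    void RespondToCommand(const char* found_command, uint8_t score, bool is_new_command) {
      if (!is_new_command) return;
      if (found_command[0] == 'g') {          // "go" -> green light
        digitalWrite(kGreenLedPin, HIGH);
        digitalWrite(kRedLedPin, LOW);
      } else if (found_command[0] == 's') {   // "stop" -> red light
        digitalWrite(kGreenLedPin, LOW);
        digitalWrite(kRedLedPin, HIGH);
      }
    }
    ```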

  5. Tomi Engdahl says:

    Accelerated inference on Arm microcontrollers with TensorFlow Lite for Microcontrollers and CMSIS-NN
    https://blog.tensorflow.org/2021/02/accelerated-inference-on-arm-microcontrollers-with-tensorflow-lite.html

  6. Tomi Engdahl says:

    The Embedded Machine Learning Revolution: The Basics You Need To Know
    https://www.wevolver.com/article/the-embedded-machine-learning-revolution-the-basics-you-need-to-know

    How Embedded Machine Learning unlocks the potential of currently unused enterprise data and enables a new wave of real-time, sustainable, and ROI-generating applications.

  7. Tomi Engdahl says:

    Face Tracker Using OpenCV and Arduino © GPL3+
    Track your face using OpenCV’s facial recognition.
    https://create.arduino.cc/projecthub/shubhamsantosh99/face-tracker-using-opencv-and-arduino-55412e
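
    For reference, the Arduino side of a tracker like this can be as small as the sketch below. It assumes the PC runs the OpenCV face detection and streams "x,y" face-centre pixel coordinates over serial, with a 640x480 frame and pan/tilt servos on pins 9 and 10; those details are placeholders, since the linked project may wire things differently.

    ```cpp
    // Arduino side of a pan/tilt face tracker: read "x,y\n" coordinates sent by the
    // PC (which runs the OpenCV detection) and steer two servos toward the face.
    // Frame size, pins and baud rate are assumptions.
    #include <Arduino.h>
    #include <Servo.h>

    Servo panServo, tiltServo;

    void setup() {
      Serial.begin(115200);
      panServo.attach(9);
      tiltServo.attach(10);
    }

    void loop() {
      if (Serial.available()) {
        int x = Serial.parseInt();            // face centre x in pixels
        int y = Serial.parseInt();            // face centre y in pixels
        if (Serial.read() == '\n') {
          panServo.write(map(x, 0, 640, 180, 0));   // map image coords to servo angles
          tiltServo.write(map(y, 0, 480, 0, 180));
        }
      }
    }
    ```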

  8. Tomi Engdahl says:

    TinyML Water Sensor – Based on Edge Impulse & Arduino Sense © MIT
    TinyML implementation to identify the sound of a running water tap; once it is heard, a buzzer + LED timer is triggered.
    https://create.arduino.cc/projecthub/enzo2/tinyml-water-sensor-based-on-edge-impulse-arduino-sense-f8b133

  9. Tomi Engdahl says:

    Get Hands-On Machine Learning Experience with Raspberry Pi and Google Alto
    Google has designed an affordable “teachable object” called Alto that makes on-device machine learning more approachable.
    https://www.hackster.io/news/get-hands-on-machine-learning-experience-with-raspberry-pi-and-google-alto-b912f26c5183

  10. Tomi Engdahl says:

    Arm’s Alessandro Grande shows how to create a person detection application with the Portenta H7 and Vision Shield running Mbed OS and TensorFlow Lite for Microcontrollers.

    TinyML person detection project with Arduino, TensorFlow and Mbed OS
    https://m.youtube.com/watch?v=pCnqCfTJxFg

  11. Tomi Engdahl says:

    When this Nano 33 BLE Sense device hears running water, it triggers a buzzer and LED timer to encourage children to wash their hands for 20 seconds.

    TinyML Water Sensor – Based on Edge Impulse & Arduino Sense © MIT
    https://create.arduino.cc/projecthub/enzo2/tinyml-water-sensor-based-on-edge-impulse-arduino-sense-f8b133

    TinyML implementation to identify the sound of a running water tap; once it is heard, a buzzer + LED timer is triggered.
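
    In an Edge Impulse Arduino deployment the trigger logic tends to boil down to a few lines like the sketch below. The header name water_sensor_inferencing.h, the "water" label, the 0.8 threshold and the buzzer pin are assumptions; the signal/run_classifier pattern follows the Edge Impulse Arduino SDK examples, and filling audio_buffer[] from the PDM microphone is left out.

    ```cpp
    // Hedged sketch of the trigger logic. The header name, the "water" label, the 0.8
    // threshold and the buzzer pin are placeholders; audio_buffer[] is assumed to be
    // filled elsewhere from the on-board PDM microphone.
    #include <Arduino.h>
    #include <string.h>
    #include <water_sensor_inferencing.h>   // hypothetical Edge Impulse library export

    static float audio_buffer[EI_CLASSIFIER_RAW_SAMPLE_COUNT];
    const int kBuzzerPin = 4;

    static int get_audio_data(size_t offset, size_t length, float *out_ptr) {
      memcpy(out_ptr, audio_buffer + offset, length * sizeof(float));
      return 0;
    }

    void check_for_running_water() {
      signal_t signal;
      signal.total_length = EI_CLASSIFIER_RAW_SAMPLE_COUNT;
      signal.get_data = &get_audio_data;

      ei_impulse_result_t result;
      if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

      for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
        // Start the 20-second hand-washing reminder when "water" is confidently detected.
        if (strcmp(result.classification[i].label, "water") == 0 &&
            result.classification[i].value > 0.8f) {
          pinMode(kBuzzerPin, OUTPUT);
          digitalWrite(kBuzzerPin, HIGH);   // buzzer/LED timer logic would start here
        }
      }
    }
    ```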

  12. Tomi Engdahl says:

    This Toolkit Lets Users Easily Capture Audio Data for Building TinyML Models
    https://www.hackster.io/news/this-toolkit-lets-users-easily-capture-audio-data-for-building-tinyml-models-f4eec0d0233e

    See how this project incorporates a more streamlined way to accurately capture, process, and predict data using tinyML and a microphone.

  13. Tomi Engdahl says:

    Calculating Reading Time with TinyML and Arduino Nano 33 BLE © MIT
    Do you want to know remaining reading time? This little TinyML Arduino device was trained to do exactly that.
    https://create.arduino.cc/projecthub/roni-bandini/calculating-reading-time-with-tinyml-and-arduino-nano-33-ble-121ac6

  14. Tomi Engdahl says:

    “Arduino offers an affordable, complete and out-of-the-box experience that’s simply hard to beat. Pushing your workloads to the edge removes the need for costly, power-hungry and processor-intensive hardware.”

    4 Ways Arduino Portenta Is Winning Industry 4.0
    https://www.edgeimpulse.com/blog/4-ways-the-new-arduino-portenta-is-winning-manufacturing-40

    For manufacturing, the combination of embedded machine learning and Internet of Things devices is a game-changer at every level. From materials to production, shipping and receiving, the next five years will be driven by connected, energy-efficient intelligent devices that can do more. A lot more. IoT connectivity, coupled with embedded ML, will not only create better automation but will also give life to a whole new class of business processes. And surprisingly, it is going to get a lot simpler to program, update, and manage. With better brains and learning capabilities, powered by ML at the edge, IoT is finally going to live up to its original expectations and transform manufacturing and other industries from the ground and up.

    1. Simplicity wins. Gone are the days of “it’s just complicated.”

    2. Lowering cost. Get used to affordable IoT that is smarter and better.

    3. Quality rules. Arduino Portenta with embedded ML delivers precision.

    4. Your way. Embedded ML with off-the-shelf hardware is now limitless.

    The Arduino Portenta with Edge Impulse is a game-changing solution, bringing intelligence to the device, your way. Start building your first Arduino Portenta models today to discover the new, unconventionally simple IoT platform.

  15. Tomi Engdahl says:

    CurrentSense-TinyML is a proof of concept for detecting microcontroller behavior using a current sensor and tinyML: bit.ly/2Pn0h3f

  16. Tomi Engdahl says:

    Killing Flies with Artificially Intelligent Hammers
    TinyML may be able to tell you what an unknown microcontroller is doing by snooping on the amount of current it draws.
    https://www.hackster.io/news/killing-flies-with-artificially-intelligent-hammers-3871b674a7cc
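
    The data-capture half of the idea is simple: sample the target's supply current into a fixed window and hand the window to a classifier. A sketch assuming an INA219 current sensor read through the Adafruit_INA219 library, a 256-sample window and 1 ms spacing; the actual project's sensor and parameters may differ.

    ```cpp
    // Capture a window of supply-current samples to use as features for a tinyML
    // classifier. The INA219 sensor, 256-sample window and 1 ms spacing are assumptions.
    #include <Wire.h>
    #include <Adafruit_INA219.h>

    Adafruit_INA219 ina219;
    const int kWindowSize = 256;
    float current_window[kWindowSize];

    void setup() {
      Serial.begin(115200);
      ina219.begin();
    }

    void loop() {
      // Fill one feature window with current readings (mA) from the device under test.
      for (int i = 0; i < kWindowSize; i++) {
        current_window[i] = ina219.getCurrent_mA();
        delay(1);
      }
      // The window would then be fed to the trained model (omitted here).
      Serial.println(current_window[0]);
    }
    ```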

  17. Tomi Engdahl says:

    Impressive work, Manivannan!
    This pocket-sized, tinyML-powered ECG analyzer device is capable of accurately diagnosing heart diseases without any dependency on the Internet.

    ECG Analyzer Powered by Edge Impulse © GPL3+
    https://create.arduino.cc/projecthub/manivannan/ecg-analyzer-powered-by-edge-impulse-24a6c2

    A TinyML-based medical device powered by Edge Impulse to predict Atrial fibrillation, AV Block 1 and Normal ECG with >90% accuracy.

  18. Tomi Engdahl says:

    Use the Arduino KNN library with the Nano 33 BLE Sense to classify the color of whatever Easter M&M (or jelly bean?) you throw at it.

    Classify Candy in Free Fall Using TinyML © CC BY
    https://create.arduino.cc/projecthub/8bitkick/classify-candy-in-free-fall-using-tinyml-2836bf

    Using the Arduino KNN library to classify the color of M&Ms we throw at it.
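
    A condensed sketch of the approach, using the Arduino_KNN library together with the Nano 33 BLE Sense's on-board APDS9960 colour sensor. The K value, the brightness normalisation and the class ids are choices made for this sketch, and training examples are assumed to be added in setup():

    ```cpp
    // Hedged sketch of KNN colour classification on a Nano 33 BLE Sense: read RGB from
    // the on-board APDS9960, normalise for brightness, and classify against previously
    // added examples. K = 5 and the class ids are assumptions.
    #include <Arduino_APDS9960.h>
    #include <Arduino_KNN.h>

    const int kInputs = 3;               // r, g, b
    KNNClassifier myKNN(kInputs);

    void setup() {
      Serial.begin(115200);
      APDS.begin();
      // Training examples would be added here (or captured interactively), e.g.:
      // float example[kInputs] = {0.5, 0.3, 0.2};
      // myKNN.addExample(example, 0);   // 0 = hypothetical class id for "red"
    }

    void loop() {
      if (!APDS.colorAvailable()) return;

      int r, g, b;
      APDS.readColor(r, g, b);

      // Normalise so classification depends on hue rather than overall brightness.
      float sum = r + g + b;
      if (sum == 0) return;
      float color[kInputs] = { r / sum, g / sum, b / sum };

      int label = myKNN.classify(color, 5);      // K = 5 nearest neighbours
      Serial.print("class: ");
      Serial.print(label);
      Serial.print("  confidence: ");
      Serial.println(myKNN.confidence());
    }
    ```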

  19. Tomi Engdahl says:

    https://www.facebook.com/156088694417458/posts/6152041811488753/

    During his tinyML Summit workshop, Pete Warden showed attendees how to train and deploy an IMU-based model for recognizing gestures on the Nano 33 BLE Sense. Now you can follow along, too!

    tinyML Summit 2021 Pete Warden Tutorial: Building a Magic Wand
    https://m.youtube.com/watch?v=vKRdQHO7tIY

  20. Tomi Engdahl says:

    Project Showcase: Machine Learning with RP2040
    https://www.youtube.com/watch?v=YGwm-TESdHA

    The RP2040 is supported with both C/C++ and MicroPython cross-platform development environments, including easy access to runtime debugging. It has UF2 boot and floating-point routines baked into the chip. The built-in USB can act as both device and host. It has two symmetric cores and high internal bandwidth, making it useful for signal processing and video. While the chip has a large amount of internal RAM, the board includes an additional 16MB external QSPI flash chip to store program code.

  21. Tomi Engdahl says:

    Embedded machine learning on Raspberry Pi just became even more accessible thanks to Edge Impulse.

    Behold! Edge Impulse and TinyML on Raspberry Pi
    https://www.hackster.io/news/behold-edge-impulse-and-tinyml-on-raspberry-pi-c3b786fa69b6

    Today we are excited to announce our foray into embedded Linux with official support for the Raspberry Pi 4!

    Now, users of Edge Impulse can select the right processor class for their embedded machine learning applications. Leverage our existing best-in-class support for low-power MCUs or venture into processor classes that run embedded Linux if highest performance is the objective.

    For audio applications, plug a standard USB microphone into one of the available USB slots on the Pi. For sensor fusion, the 40-pin GPIO header on the Pi can be employed to connect to your favorite sensors as well.

    SDKs for Python, Node.js, Go, and C++ are provided so you can easily build your own custom apps for inferencing.

  22. Tomi Engdahl says:

    Because in the words of Alasdair Allan, “The importance of bananas to machine learning researchers cannot be overstated!”

    In Arduino’s latest Pro tutorial, Sebastian Romero shows how to train a custom model using Edge Impulse and run it on a Portenta H7 + Vision Shield to detect different types of fruit: https://bit.ly/3dVeq0y

  23. Tomi Engdahl says:

    Use tinyML on a Nano 33 BLE Sense to classify and predict a driver’s braking patterns.

    Driver Braking Analyzer – A TinyML Model Using Edge Impulse © GPL3+
    https://create.arduino.cc/projecthub/manivannan/driver-braking-analyzer-a-tinyml-model-using-edge-impulse-b1f749

    A TinyML model predicts the driver’s braking pattern over speed breakers and potholes to evaluate driving skill.
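
    The data side of a project like this is mostly about capturing fixed-length accelerometer windows to label and later classify. A sketch using the Nano 33 BLE Sense's LSM9DS1; the 200-sample window is an assumption, and the actual inference call (for example an Edge Impulse impulse) is left out:

    ```cpp
    // Capture one window of accelerometer data on a Nano 33 BLE Sense, ready to be
    // labelled (smooth vs. harsh braking) or passed to a trained classifier.
    // The 200-sample window (~2 s at the LSM9DS1's ~100 Hz rate) is an assumption.
    #include <Arduino_LSM9DS1.h>

    const int kSamples = 200;
    float features[kSamples * 3];         // ax, ay, az interleaved

    void setup() {
      Serial.begin(115200);
      if (!IMU.begin()) {
        Serial.println("Failed to initialise IMU");
        while (true) {}
      }
    }

    void loop() {
      // Fill one window of acceleration data (units of g).
      for (int i = 0; i < kSamples; i++) {
        float x, y, z;
        while (!IMU.accelerationAvailable()) {}
        IMU.readAcceleration(x, y, z);
        features[i * 3 + 0] = x;
        features[i * 3 + 1] = y;
        features[i * 3 + 2] = z;
      }
      // features[] is now ready for inference or labelling.
      Serial.println("window captured");
    }
    ```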

  24. Tomi Engdahl says:

    Visual Raspberry Pi With Node-Red And TensorFlow
    https://hackaday.com/2021/04/22/visual-raspberry-pi-with-node-red-and-tensorflow/

    If you prefer to draw boxes instead of writing code, you may have tried IBM’s Node-RED to create logic with drag-and-drop flows. A recent [TensorFlow] video shows an interview between [Jason Mayes] and [Paul Van Eck] about using TensorFlow.js with Node-RED to create machine learning applications for Raspberry Pi visually. You can see the video, below.

    https://www.youtube.com/watch?v=cZj1d25eeWY&feature=emb_logo

  25. Tomi Engdahl says:

    Mini golf + Connect Four = Connect Fore!

    Playing Connect Four against a mini golfing AI opponent
    https://blog.arduino.cc/2021/04/27/playing-connect-four-against-a-mini-golfing-ai-opponent/

    Have you dreamed of combining the two incredible activities putt-putt and Connect Four together into the same game? Well one daring maker set out to do just that. Bithead’s innovative design involves a mini golf surface with seven holes at the end corresponding to the columns. The system can keep track of where each golf ball is with an array of 42 color sensors that are each connected to one of seven I2C multiplexers, all leading to a single Arduino Uno.

    The player can select from six distinct levels of AI, all the way from random shots in the dark to Q Learning, which records previous game-winning moves to improve how it plays over time. It can putt by first loading a golf ball into a chamber and then spinning up a pair of high-RPM motors that launch it. For the human player, there is a pair of dispensers on the left that give the correct color of ball.
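
    The sensor-scanning part of a build like this usually comes down to enabling one multiplexer channel at a time. A hedged sketch assuming TCA9548A-style muxes at consecutive addresses 0x70 to 0x76 with six colour sensors each (7 x 6 = 42); the real machine's parts and wiring may differ:

    ```cpp
    // Address one of 42 colour sensors sitting behind seven 8-channel I2C multiplexers.
    // Assumes TCA9548A-style muxes at addresses 0x70..0x76 and six sensors per mux;
    // the actual build's wiring may differ.
    #include <Wire.h>

    const uint8_t kFirstMuxAddress = 0x70;

    void selectSensor(uint8_t sensorIndex) {          // sensorIndex: 0..41
      uint8_t mux     = sensorIndex / 6;              // which multiplexer
      uint8_t channel = sensorIndex % 6;              // which port on that multiplexer

      // Disable all other muxes so only one channel is ever active on the bus.
      for (uint8_t m = 0; m < 7; m++) {
        Wire.beginTransmission(kFirstMuxAddress + m);
        Wire.write(m == mux ? (uint8_t)(1 << channel) : (uint8_t)0);
        Wire.endTransmission();
      }
      // The selected colour sensor can now be read at its normal I2C address.
    }

    void setup() {
      Wire.begin();
    }

    void loop() {
      for (uint8_t i = 0; i < 42; i++) {
        selectSensor(i);
        // Read the colour sensor here to see whether a ball occupies this board cell.
      }
    }
    ```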

  26. Tomi Engdahl says:

    “Countless products and applications can be made 10 to 100 times smarter and more useful in a way that hasn’t been considered technologically or commercially feasible before.”

    Learn why Nordic Semiconductor CTO Svein-Egil Nielsen thinks tinyML is such a big deal.

    Democratizing machine learning will transform IoT
    https://internetofthingsagenda.techtarget.com/blog/IoT-Agenda/Democratizing-machine-learning-will-transform-IoT

    Even a few years ago, running machine learning on the battery-powered, wireless devices typical of IoT was extremely difficult for an embedded engineer to do. One of the main problems was that machine learning required a PhD-level computer and data science background.

    Today these restrictions no longer exist thanks to advances in both wireless IoT chips and a scaled-down version of machine learning called tiny machine learning or TinyML for short.

    Put simply: machine learning has become accessible to all in wireless IoT because the machine learning complexity is abstracted away from the end user, for example through graphical data representations.

  27. Tomi Engdahl says:

    Never Stop Learning
    TinyML can add smarts to microcontrollers, but online learning is needed to keep them smart.
    https://www.hackster.io/news/never-stop-learning-21a750d7508d

    If it seems like you are hearing about tinyML every time you turn around these days, that is probably because, well, you are hearing about tinyML every time you turn around. And rightly so — the democratization of machine learning and artificial intelligence is a big deal. In addition to making the technologies more accessible, tinyML also has implications for protecting privacy, reducing application latency, and improving energy efficiency.

  28. Tomi Engdahl says:

    Eloquent Arduino’s Benchmarking Puts the Raspberry Pi Pico Bottom of the Pile for TinyML Performance
    Its low cost and feature set have made the Raspberry Pi Pico a popular device, but benchmarking shows it’s not great for TinyML.
    https://www.hackster.io/news/safety-and-security-in-medical-devices-1195f915ae8a

  29. Tomi Engdahl says:

    Edge Impulse Raises $15M for ML Development Platform
    ML development platform aims to democratize AI, making technology accessible to billions of edge and IoT devices
    https://www.eetimes.com/edge-impulse-raises-15m-for-ml-development-platform/

  30. Tomi Engdahl says:

    Starting out with Machine Learning, Arduino and TensorFlow. It is another Blink example, but different! Information and links can be found in the YouTube “Show More” section: https://www.youtube.com/watch?v=6EklKMHWidU

  31. Tomi Engdahl says:

    The TensorFlow Lite for Microcontrollers Experiments Collection features work by developers who are using Arduino and TFLite Micro to create awesome experiences and tools.

    Need inspiration for your next tinyML project? Check it out!

    TensorFlow Lite for Microcontrollers
    https://experiments.withgoogle.com/collection/tfliteformicrocontrollers

  32. Tomi Engdahl says:

    The After Eight Step is an Arduino and Max-powered eight-step sequencer with ML functionality — housed inside an After Eight mints tin, of course!

    https://m.youtube.com/watch?v=Zxq5JtLpzR0

  33. Tomi Engdahl says:

    Researchers managed to cram a machine learning algorithm capable of recognizing the handwritten digits within the MNIST dataset onto an Arduino Uno’s 2KB RAM.

    Recognizing handwritten MNIST digits on an Arduino Uno using LogNNet
    https://blog.arduino.cc/2021/05/19/recognizing-handwritten-mnist-digits-on-an-arduino-uno-using-lognnet/

    The Arduino Uno is famous for its ease of use and compact size, yet its microcontroller, the ATmega328P, is quite small. The 328P contains a mere 32KB of flash storage for programs and 2KB of RAM, which has traditionally made it unsuitable for machine learning applications. However, a team at the Institute of Physics and Technology at Petrozavodsk State University was able to cram in an algorithm that can recognize the handwritten digits within the MNIST dataset. Without getting too complicated, the Uno takes in an array of pixels that range in value from 0 to 255, or one byte. The entire 28 by 28 grid is then flattened to a single array of 784 elements that is passed into a reservoir that holds the weights for each pixel.

    As the model continues to get trained, these weights are gradually adjusted until the output matches the correct digit.

    Input data is read from the serial port and stored within an array, where it is then used within the LogNNet library to compute the layer values. Once everything has been passed through the neural network the resulting digit is printed to the serial monitor. Overall, the neural network’s variables in RAM are quite space-efficient and account for just over a kilobyte.

    The researchers were able to achieve an accuracy of 82% with an inferencing time of around seven seconds, which is quite impressive for such a small chip.
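
    The input path described above, 784 one-byte pixels arriving over the serial port, looks roughly like the sketch below. The call into the LogNNet reservoir itself is omitted because its API is not shown here, and the baud rate is a placeholder:

    ```cpp
    // Read one flattened 28x28 MNIST image (784 pixels, one byte each, values 0..255)
    // from the serial port into RAM. Feeding the array into the LogNNet reservoir is
    // omitted; only the input path is sketched here.
    #include <Arduino.h>

    const int kPixels = 28 * 28;       // 784 bytes, a large chunk of the Uno's 2 KB RAM
    uint8_t image[kPixels];

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      static int received = 0;
      while (Serial.available() && received < kPixels) {
        image[received++] = (uint8_t)Serial.read();
      }
      if (received == kPixels) {
        // image[] now holds the full digit; pass it to the classifier here.
        Serial.println("image received");
        received = 0;
      }
    }
    ```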

  34. Tomi Engdahl says:

    Did you miss Pete Warden’s #GoogleIO tinyML workshop? Not to worry, the recording is now up! Watch and learn how to train machine learning models using a Nano 33 BLE Sense and Web BLE.

    https://m.youtube.com/watch?time_continue=3449&v=jqVCR2MUJEs&feature=emb_title

  35. Tomi Engdahl says:

    Google is challenging developers to create projects that push boundaries, spark joy, and show off the helpfulness of TFLite Micro for a chance to win $2,500, a session with the TensorFlow team, and more!

    Hurry and enter, the first 750 submissions will also receive a custom TensorFlow Lite for Microcontrollers Kit featuring the Nano 33 BLE Sense.

    The TensorFlow Microcontroller Challenge
    https://experiments.withgoogle.com/tfmicrochallenge

    We challenge you to create a project that pushes boundaries, sparks joy, and shows off the helpfulness of TensorFlow Lite for Microcontrollers.

  36. Tomi Engdahl says:

    Speech Recognition On An Arduino Nano?
    https://hackaday.com/2021/05/26/speech-recognition-on-an-arduino-nano/

    Like most of us, [Peter] had a bit of extra time on his hands during quarantine and decided to take a look back at speech recognition technology in the 1970s. Quickly, he started thinking to himself, “Hmm…I wonder if I could do this with an Arduino Nano?” We’ve all probably had similar thoughts, but [Peter] really put his theory to the test.

    The hardware itself is pretty straightforward. There is an Arduino Nano to run the speech recognition algorithm and a MAX9814 microphone amplifier to capture the voice commands. However, the beauty of [Peter’s] approach, lies in his software implementation. [Peter] has a bit of an interplay between a custom PC program he wrote and the Arduino Nano. The learning aspect of his algorithm is done on a PC, but the implementation is done in real-time on the Arduino Nano, a typical approach for really any machine learning algorithm deployed on a microcontroller. To capture sample audio commands, or utterances, [Peter] first had to optimize the Nano’s ADC so he could get sufficient sample rates for speech processing. Doing a bit of low-level programming, he achieved a sample rate of 9ksps, which is plenty fast for audio processing.

    https://www.instructables.com/Speech-Recognition-With-an-Arduino-Nano/
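
    For anyone curious what that low-level ADC work involves: putting the ATmega328P's ADC into free-running, interrupt-driven mode with a /128 prescaler gives roughly 9.6 k samples per second, in the same ballpark as the figure quoted above. This is a generic register-level sketch, not [Peter]'s actual code:

    ```cpp
    // Configure the ATmega328P ADC for free-running, interrupt-driven sampling on A0.
    // With a 16 MHz clock and a /128 prescaler, each conversion takes 13 ADC clocks,
    // giving roughly 9.6 k samples/s.
    #include <Arduino.h>
    #include <avr/interrupt.h>

    volatile uint8_t latest_sample = 0;

    void setup() {
      Serial.begin(115200);

      ADMUX  = _BV(REFS0) | _BV(ADLAR);             // AVcc reference, left-adjust, channel A0
      ADCSRB = 0;                                   // free-running trigger source
      ADCSRA = _BV(ADEN) | _BV(ADATE) | _BV(ADIE)   // enable ADC, auto-trigger, interrupt
             | _BV(ADPS2) | _BV(ADPS1) | _BV(ADPS0) // prescaler /128
             | _BV(ADSC);                           // start the first conversion
    }

    ISR(ADC_vect) {
      latest_sample = ADCH;                         // 8-bit sample (ADLAR set)
    }

    void loop() {
      // Audio samples arrive in the background; buffer and process them here.
    }
    ```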

  37. Tomi Engdahl says:

    In this EloquentArduino post, Simone Salerno shows how to load TensorFlow Lite models onto your WiFi-equipped Arduino from the Internet.

    https://eloquentarduino.github.io/2021/06/howto-load-tensorflow-lite-tinyml-model-from-internet-on-arduino/

  38. Tomi Engdahl says:

    A multi-tier approach to #MachineLearning at the edge can help streamline both development and deployment for the #AIoT
    #EVS21 #AI #IoT
    https://buff.ly/3ioEPHD

  39. Tomi Engdahl says:

    This tinyML system monitors hive health by listening to and analyzing the sound of bee buzzing with a Nano 33 BeeLE.

    BeeMonitor © MIT
    Assistant for efficient beekeeping!
    https://create.arduino.cc/projecthub/delfin-ki/beemonitor-96f5d7

  40. Tomi Engdahl says:

    Never again will you have to check whether your water is boiling! The Nano 33 BLE Sense-based Boilarm listens to the pot on your stove and notifies you with a smartphone alert.

    Boilarm
    https://create.arduino.cc/projecthub/kuharji/boilarm-267661

    Never again do you have to check whether your water is boiling. Put on your headphones and let your phone notify you!

  41. Tomi Engdahl says:

    This wand uses a tinyML model on the Nano 33 BLE Sense to recognize when you draw constellations in the sky.

    https://experiments.withgoogle.com/astrowand

  42. Tomi Engdahl says:

    Gesture-Detecting Macro Keyboard Knows What You Want
    https://hackaday.com/2021/06/13/gesture-detecting-macro-keyboard-knows-what-you-want/

    [jakkra] bought a couple of capacitive touchpads from a Kickstarter a few years ago and recently got around to using them in a project. And what a project it is: this super macro pad combines two touchpads with a 6-pack of regular switches for a deluxe gesture-sensing input device.

    Inside is an ESP32 running TensorFlow Lite to read in the gestures from the two touchpads. The pad at the top is a volume slider, and the square touchpad is the main input and is used in conjunction with the buttons to run AutoHotKey scripts within certain programs. [jakkra] can easily run git commands and more with a handful of simple gestures.

    https://github.com/jakkra/Gesture-Detecting-Macro-Keyboard

  43. Tomi Engdahl says:

    MLPerf Tiny Inference is a new benchmarking suite for ultra-low-power tinyML devices.

    How Does Your TinyML Stack Up?

    https://www.hackster.io/news/how-does-your-tinyml-stack-up-4bef125cf374?ed2357bbbd318d584d579d024d9fd808
    MLPerf Tiny Inference benchmarking suite explores how tinyML algorithms compare with one another.

  44. Tomi Engdahl says:

    Arduino recently joined the list of Picovoice’s supported devices, starting with the Nano 33 BLE Sense.

    Picovoice, an Offline Voice Recognition SDK, Adds Support for Arduino Boards
    Operating entirely offline, Picovoice is now available on microcontrollers as well as microcomputers.
    https://www.hackster.io/news/picovoice-an-offline-voice-recognition-sdk-adds-support-for-arduino-boards-26c4e82debda

    Picovoice, the offline speech recognition and wake word detection specialist, has shown off a new feather in its cap: support for Arduino-compatible microcontrollers.

    Picovoice announced its wake word and speech-to-intent software development kit, which offers support for fully-offline operation, two years ago with a demonstration of it running on a Raspberry Pi single-board computer. The latest version, though, extends the same capabilities to microcontrollers.

    “Did you know it’s now possible to perform high-accuracy speech recognition on a microcontroller,” Picovoice’s Mohammadreza Rostam writes in the introduction to a demonstration of just that, brought to our attention by CNX Software. “Advances in machine learning have brought voice capability to these extremely resource-constrained devices, via the Picovoice SDK. Arduino recently joined the list of supported devices, starting with the Arduino Nano 33 BLE Sense.”

    More information, and links to source code and documentation, are available on Rostam’s Medium post.
    https://medium.com/picovoice/offline-voice-ai-on-arduino-4a7f7e572bfb

  45. Tomi Engdahl says:

    Not quite Dug’s collar from the Pixar movie Up, but this small device utilizes the powers of tinyML to recognize what your canine companion wants based on vocal signals.

    PUPPI is a tinyML device designed to interpret your dog’s mood via sound analysis
    https://blog.arduino.cc/2021/06/18/puppi-is-a-tinyml-device-designed-to-interpret-your-dogs-mood-through-sound-analysis/

    Dogs are not known to be the most advanced communicators, so figuring out what they want based on a few noises and pleading looks can be tough. This problem is what inspired a team of developers to come up with PUPPI — a small device that utilizes tinyML to interpret your canine companion’s mood through vocal signals. Their project employs an Arduino Nano 33 BLE Sense and its onboard microphone to both capture the data and run inferencing with the model they trained using Edge Impulse. After collecting ample amounts of data for barks, growls, whines, and other noises, their model achieved an accuracy of around 92%.

