Machine learning possible on microcontrollers

ARM’s Zach Shelby introduced the use of microcontrollers for machine learning and artificial intelligence at the ECF19 event in Helsinki last Friday. The talk showed that artificial intelligence and machine learning can be applied to small embedded devices in addition to the cloud-based model. In particular, artificial intelligence is well suited to Internet of Things devices. Using machine learning in IoT also makes sense from an energy-efficiency point of view when unnecessary power-consuming communication can be avoided (for example, local keyword detection before sending voice data to the cloud for more detailed analysis).

According to Shelby, we are now moving to a third wave of IoT that comes with comprehensive device security and voice control. In this model, machine learning techniques are one new application that can be added to previous work done on IoT.

To use machine learning successfully on small embedded devices, the problem to be solved must involve reasonably little incoming information and a very limited number of possible outcomes. An ARM Cortex-M4 processor equipped with a DSP unit is powerful enough for simple handwriting recognition or for detecting a few spoken words with a machine learning model. In the examples shown, the machine learning models needed less than 100 kilobytes of memory.


The presentation can now be viewed on YouTube:

Important tools and projects mentioned in the presentation:

TinyML

TensorFlow Lite

uTensor (ARM MicroTensor)

TensorFlow Lite Micro

Articles about the presentation:

https://www.uusiteknologia.fi/2019/05/20/ecf19-koneoppiminen-mahtuu-mikro-ohjaimeen/

http://www.etn.fi/index.php/72-ecf/9495-koneoppiminen-mullistaa-sulautetun-tekniikan

 

420 Comments

  1. Tomi Engdahl says:

    This smart glove uses an Arduino Nano 33 BLE Sense running TinyML to send a cyclist’s hand gestures to a backpack-mounted LED matrix.

    Improving Cyclist Safety with a Smart Glove Signaling System
    This wearable device uses an Arduino running TinyML to send hand gestures to a backpack-mounted LED matrix over BLE.
    https://www.hackster.io/news/improving-cyclist-safety-with-a-smart-glove-signaling-system-cf1014041fc1

  2. Tomi Engdahl says:

    Use an OV7670 camera module with the Nano 33 BLE Sense to obtain image data for TinyML applications.

    Machine vision with low-cost camera modules
    https://blog.arduino.cc/2020/06/24/machine-vision-with-low-cost-camera-modules/

    If you’re interested in embedded machine learning (TinyML) on the Arduino Nano 33 BLE Sense, you’ll have found a ton of on-board sensors — digital microphone, accelerometer, gyro, magnetometer, light, proximity, temperature, humidity and color — but realized that for vision you need to attach an external camera.

    In this article, we will show you how to get image data from a low-cost VGA camera module. We’ll be using the Arduino_OV767X library to make the software side of things simpler.

  3. Tomi Engdahl says:

    tinyML development with Tensorflow Lite for Microcontrollers using CMSIS-NN and Ethos-U55
    https://m.youtube.com/watch?feature=youtu.be&t=2392&v=TPD3MnL1nTc

  4. Tomi Engdahl says:

    This ML-powered bot sits on your desk and monitors your online comments.

    Build a comment-critiquing keyboard adapter using TensorFlow Lite and Arduino
    https://blog.arduino.cc/2020/07/07/build-a-comment-critiquing-keyboard-keyboard-using-tensorflow-lite-and-arduino/

    If you’ve ever left an online comment that you later regretted, this anti-troll bot will keep that from happening again by letting you know when you’re being a bit too harsh.

    The device — which was created by Andy of element14 Presents — intercepts raw keyboard inputs using a MKR Zero board and analyzes them using a TensorFlow Lite machine learning algorithm.

    As an output, the Arduino controls the mouth of a rather hilarious human cutout via a servo motor.

    The project is a great example of running ML code on limited hardware, and more info on the sentiment-analyzing keyboard adapter can be found here.

    https://www.element14.com/community/docs/DOC-95217/l/episode-453-build-an-anti-troll-bot-using-tensorflow-and-arduino

  5. Tomi Engdahl says:

    SEFR is a binary classifier for Arduino and other ultra-low power devices: https://bit.ly/2CxbiZq

  6. Tomi Engdahl says:

    Person detection on the Nano 33 BLE Sense gets a 16X speed boost thanks to Arm CMSIS-NN optimizations in TensorFlow Lite for Microcontrollers. Just a one-click install in the Arduino IDE!

    tinyML development with Tensorflow Lite for Microcontrollers using CMSIS-NN and Ethos-U55 | Arm
    https://m.youtube.com/watch?v=TPD3MnL1nTc&t=2572&feature=youtu.be

    Deep Neural Networks are becoming increasingly popular in always-on endpoint devices. Developers can perform data analytics right at the source, with reduced latency as well as energy consumption. During this talk, we will introduce how Tensorflow Lite for Microcontrollers (TFLu) and its integration with CMSIS-NN will maximize the performance of machine learning applications. Developers can now run larger, more complex neural networks on Arm MCUs and micro NPUs while reducing memory footprint and inference time.

  7. Tomi Engdahl says:

    Bright idea! This Arduino and Python-powered task light uses object detection to track and follow your hand.

    A hand-following AI task lamp for your desk
    https://blog.arduino.cc/2020/07/15/an-ai-powered-lamp-for-your-desk/

    As you work on a project, lighting needs change dynamically. This can mean manual adjustment after manual adjustment, making do with generalized lighting, or having a helper hold a flashlight. Harry Gao, however, has a different solution in the form of a novel robotic task lamp.

    https://hackaday.io/project/173712-3d-printed-hand-following-ai-task-light

  8. Tomi Engdahl says:

    Here’s a great intro to tinyML, featuring Arm’s Wei Xiao and Pete Warden (pictured here waving around a Nano 33 BLE Sense).

    Google and Arm: tinyML
    https://m.youtube.com/watch?v=ZH6TVCp5K0c

  9. Tomi Engdahl says:

    Edge Impulse partners with Open MV to add computer vision support for Cortex-M class devices.

    Tiny Computer Vision for All Embedded Devices
    Edge Impulse has partnered with OpenMV to add computer vision support for Cortex-M class devices.
    https://www.hackster.io/news/tiny-computer-vision-for-all-embedded-devices-ce909feef1db

  10. Tomi Engdahl says:

    Run a gesture recognition model on your Nano 33 BLE for use with a Raspberry Pi-powered game!

    Machine Learning with Nano BLE 33 & Raspberry Pi © MIT
    https://create.arduino.cc/projecthub/zoromoth/machine-learning-with-nano-ble-33-raspberry-pi-958b2f

    In this project we take a look at making the Arduino Nano 33 BLE learn a few gestures and how to integrate it with a game!

  11. Tomi Engdahl says:

    Embedding #AI in smart #sensors STMicroelectronics NV #MachineLearning #OnEdgeProcessing #MEMS https://buff.ly/2BtmGFB

  12. Tomi Engdahl says:

    Researchers have released a linear-time, high-performance image classifier designed to run on low-power devices — and Eloquent Arduino’s Simone has ported the C implementation to the Eloquent ML library for microcontroller users.

    SEFR Algorithm Performs Image Classification, Including Training, on an Arduino Uno or Other MCU
    https://www.hackster.io/news/sefr-algorithm-performs-image-classification-including-training-on-an-arduino-uno-or-other-mcu-8a707c2e18aa

    Now available as part of the Eloquent ML machine learning library, or directly as Python and C example implementations.

    Researchers from the Tarbiat Modares and Boston Universities have released a linear-time high-performance image classifier designed to run on low-power devices — and Eloquent Arduino’s Simone has ported the C implementation to the Eloquent Machine Learning (Eloquent ML) library for microcontroller users.

    “One of the fundamental challenges for running machine learning algorithms on battery-powered devices is the time and energy needed for computation, as these devices have constraints on resources,” the research team explains of its work on SEFR. “There are energy-efficient classifier algorithms, but their accuracy is often sacrificed for resource efficiency. Here, we propose an ultra-low power binary classifier, SEFR, with linear time complexity, both in the training and the testing phases.”

    “The SEFR method runs by creating a hyperplane to separate two classes. The weights of this hyperplane are calculated using normalization, and then the bias is computed based on the weights. SEFR is comparable to state-of-the-art classifiers in terms of classification accuracy, but its execution time and energy consumption are 11.02% and 8.67% of the average of state-of-the-art and baseline classifiers.”

    In testing, Simone found that his port of sefr could achieve 100 percent accuracy on an iris classification task with just four features, 89 percent for breast cancer classification at 30 features, 84 percent for wine at 13 features, and 99 percent accuracy for digits at 64 features. “Considering that the model only needs one weight per feature,” Simone notes, “I think this results are impressive!”

    “We have implemented SEFR on Arduino Uno, and on a dataset with 100 records and 100 features, the training time is 195 milliseconds, and testing for 100 records with 100 features takes 0.73 milliseconds. To the best of our knowledge, this is the first multi-purpose algorithm specifically devised for learning on ultra-low power devices.”
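
    Based on that description, here is a rough NumPy sketch of the SEFR training and prediction steps (my own illustration, not the reference implementation; details such as the exact bias weighting may differ from the authors’ code):

    import numpy as np

    def sefr_fit(X, y, eps=1e-7):
        # Train a SEFR-style binary classifier; labels are assumed to be 0/1.
        pos, neg = X[y == 1], X[y == 0]
        mu_pos, mu_neg = pos.mean(axis=0), neg.mean(axis=0)
        # One weight per feature, from the normalized difference of class means.
        w = (mu_pos - mu_neg) / (mu_pos + mu_neg + eps)
        scores = X @ w
        avg_pos, avg_neg = scores[y == 1].mean(), scores[y == 0].mean()
        # Bias placed between the two class score means, weighted by class sizes.
        b = (len(neg) * avg_pos + len(pos) * avg_neg) / len(y)
        return w, b

    def sefr_predict(X, w, b):
        return (X @ w >= b).astype(int)

    # Toy usage with two Gaussian blobs standing in for real sensor features.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(1.0, 0.3, (50, 4)), rng.normal(2.0, 0.3, (50, 4))])
    y = np.array([0] * 50 + [1] * 50)
    w, b = sefr_fit(X, y)
    print("train accuracy:", (sefr_predict(X, w, b) == y).mean())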

    The SEFR paper is available under open-access terms on arXiv.org now; the original C and Python implementations are on GitHub under an unspecified license.

    https://github.com/sefr-classifier/sefr

    Simone’s Python package is available on GitHub under the permissive MIT License, or as part of the Eloquent Micro ML repository.

  13. Tomi Engdahl says:

    Simone’s Python package is available on GitHub under the permissive MIT License, or as part of the Eloquent Micro ML repository.

    https://github.com/eloquentarduino/sefr

    https://github.com/eloquentarduino/EloquentMicroML

  14. Tomi Engdahl says:

    This Arduino-based baby monitor uses Edge Impulse ML to distinguish crying from other noises, preventing false alarms and alerting parents only when needed.

    BABL: a Baby Monitor Powered by tinyML and Edge Impulse!
    https://www.hackster.io/ishotjr/babl-a-baby-monitor-powered-by-tinyml-and-edge-impulse-f5045f

    BABL leverages tinyML to distinguish a baby’s cry from other noise, preventing false alarms, and alerting parents only when needed.

  15. Tomi Engdahl says:

    NXP Customizes AI Compiler for MCU Products
    https://www.eetimes.com/nxp-customizes-ai-compiler-for-mcu-products/

    In a sign that machine learning techniques are fast gaining adoption on embedded platforms, NXP announced that it has created a customized implementation of Glow for microcontrollers (MCU), including some of its i.MX RT family. Glow is a neural network compiler that optimizes neural networks for specific target hardware.

    NXP is the first of the microcontroller vendors to create a customized version of Glow for its hardware. It has done so for the Cortex-M cores and Tensilica HiFi4 DSP core on its i.MX RT685, RT1050 and RT1060 microcontrollers.

  16. Tomi Engdahl says:

    The future for billions of edge devices is about to drastically change with new innovations from companies like Google AI, Arm and Edge Impulse, who are giving basic hardware eyes, ears, and brains using TinyML.

    Can Edge Devices See, Hear and Think?
    https://www.hackster.io/news/can-edge-devices-see-hear-and-think-a6b9b8f2ac23

    TinyML, a cyborg fantasy for edge devices. Innovations from Google, Arm, Edge Impulse, and others are game changers for edge computing.

  17. Tomi Engdahl says:

    Run a TensorFlow Lite speech recognition model on a Nano 33 BLE Sense to control your robotic vehicle via voice commands!

    Arduino Machine Learning: Build a Tensorflow lite model to control robot-car
    https://www.survivingwithandroid.com/arduino-machine-learning-tensorflow-lite/

    To build this project there are at least two steps:

    train a new machine learning model and adapt it to run on Arduino
    build the car that uses the TensorFlow model trained in the previous step

    As you may already know, we can’t run TensorFlow models directly on an Arduino because the device has limited resources. Therefore, after the model is trained, it is necessary to shrink it to reduce its size. We will describe step by step how to build the model and then how to convert it into a form that is compatible with Arduino.

    To recognize voice commands with Arduino, we need a TensorFlow model that uses a CNN. The first step is acquiring the audio using the Arduino Nano 33 built-in microphone and applying the FFT to it. The data extracted using the Fast Fourier Transform will feed the CNN. Recognizing commands is a classification problem.
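
    The “shrink and convert” step usually means running the trained Keras model through the TensorFlow Lite converter with post-training quantization, then turning the result into a C array for the sketch. A rough Python sketch of that step (the tiny untrained model and the random feature frames below are placeholders for the real keyword-spotting CNN and its training data):

    import numpy as np
    import tensorflow as tf

    # Placeholder CNN standing in for the trained keyword-spotting model
    # (input: 49 spectrogram frames of 40 bins, output: 4 command classes).
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(49, 40, 1)),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])

    def representative_data_gen():
        # Placeholder: a real project would yield feature frames from the training set.
        for _ in range(100):
            yield [np.random.rand(1, 49, 40, 1).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]        # enable quantization
    converter.representative_dataset = representative_data_gen  # calibrate int8 ranges
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]

    tflite_model = converter.convert()
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)
    print("model size:", len(tflite_model), "bytes")

    On Linux or macOS, “xxd -i model.tflite > model.h” then produces a C array that the Arduino sketch can compile in and hand to the TensorFlow Lite Micro interpreter.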

  18. Tomi Engdahl says:

    This Nano 33 BLE Sense-based baby monitor uses Edge Impulse and tinyML to distinguish crying from other sounds, preventing false alarms and allowing parents to enjoy those precious few moments of sleep!

    BABL: A Baby Monitor Powered by tinyML and Edge Impulse! © MIT
    https://create.arduino.cc/projecthub/ishotjr/babl-a-baby-monitor-powered-by-tinyml-and-edge-impulse-f5045f

    BABL leverages tinyML to distinguish a baby’s cry from other noise, preventing false alarms, and alerting parents only when needed.

    With Edge Impulse’s recent announcement of support for the popular Arduino Nano 33 BLE Sense, the exciting world of Machine Learning is now easily accessible to even novice Arduino developers. To demonstrate how quick and easy it is to get started and create results with Edge Impulse Studio, I wanted to target a conventional application that could be dramatically improved with machine learning. Baby monitors work by transmitting audio wirelessly from a transmitter in the baby’s room to a receiver which the parent can monitor.

  19. Tomi Engdahl says:

    SEFR Algorithm Performs Image Classification, Including Training, on an Arduino Uno or Other MCU
    Now available as part of the Eloquent ML machine learning library, or directly as Python and C example implementations.
    https://www.hackster.io/news/sefr-algorithm-performs-image-classification-including-training-on-an-arduino-uno-or-other-mcu-8a707c2e18aa

  20. Tomi Engdahl says:

    GaussianNB on Arduino: Easy-to-train, top-accuracy fast classifier!

    EloquentML grows its family of classifiers: Gaussian Naive Bayes on Arduino
    2 AUGUST 2020

    https://eloquentarduino.github.io/2020/08/eloquentml-grows-its-family-of-classifiers-gaussian-naive-bayes-on-arduino/

    Are you looking for a top-performing classifier with a minimal number of parameters to tune? Look no further: Gaussian Naive Bayes is what you’re looking for. And thanks to EloquentML you can now port it to your microcontroller.

    (Gaussian) Naive Bayes
    Naive Bayes classifiers are simple models based on probability theory that can be used for classification.

    They originate from the assumption of independence among the input variables. Even though this assumption doesn’t hold true in the vast majority of cases, they often perform very well on many classification tasks, so they’re quite popular.

    Gaussian Naive Bayes stacks on another (mostly wrong) assumption: that the variables follow a Gaussian probability distribution.
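
    For reference, the standard scikit-learn version of the same classifier looks roughly like this; the EloquentML post presumably starts from a fitted model along these lines before porting it to C for the microcontroller (the dataset here is just the classic iris example):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    # Gaussian NB only has to learn a per-class mean and variance for each feature,
    # which is why the fitted model is tiny and cheap enough to port to an MCU.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = GaussianNB().fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))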

  21. Tomi Engdahl says:

    EloquentArduino shows how to implement a wireless indoor positioning system using a MKR WiFi 1010 board and machine learning.
    The Ultimate Guide to Wifi Indoor Positioning using Arduino and Machine Learning
    8 AUGUST 2020
    https://eloquentarduino.github.io/2020/08/the-ultimate-guide-to-wifi-indoor-positioning-using-arduino-and-machine-learning/

    This will be the most detailed, easy-to-follow tutorial on the Web on how to implement WiFi indoor positioning using an Arduino microcontroller and machine learning. It contains all the steps, tools and code from the start to the end of the project.
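
    The usual approach to WiFi indoor positioning, and as far as I can tell the one in the tutorial, is to treat the RSSI of the WiFi networks visible to the MKR WiFi 1010 as a feature vector and classify which room the reading was taken in. A toy Python sketch with made-up values (the access points, readings and choice of classifier are illustrative only, not taken from the tutorial):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Each row: RSSI (dBm) of five known access points; unseen networks get -100.
    X = np.array([
        [-40, -71, -82, -90, -100],   # kitchen
        [-43, -69, -80, -88, -100],   # kitchen
        [-74, -45, -61, -85, -92],    # bedroom
        [-72, -48, -59, -86, -90],    # bedroom
        [-85, -80, -44, -55, -70],    # office
        [-83, -79, -47, -54, -72],    # office
    ])
    y = ["kitchen", "kitchen", "bedroom", "bedroom", "office", "office"]

    clf = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)
    print(clf.predict([[-41, -70, -81, -89, -100]]))  # expected: ['kitchen']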

  22. Tomi Engdahl says:

    Writing for EE Journal, Max Maxfield shares his experience building his first tinyML app with the Nano 33 IoT.

    https://www.eejournal.com/article/i-just-created-my-first-ai-ml-app-part-2/

  23. Tomi Engdahl says:

    Welcome to the Wizarding World of tinyML! Create a Harry Potter-like wand that responds to “Lumos“ and “Nox” commands using an Edge Impulse audio recognition model on your Nano 33 BLE Sense.

    Audio Recognition with Embedded Machine Learning
    TinyML | Jan Fiedler | Aug 02, 2020
    https://jan-fiedler.net/tinyml-audio-recognition/

    How do Harry Potter, TinyML and an Arduino match each other, you may wonder? Follow along and see how to become a wizard using embedded machine learning.

    Recently I became aware of something called tiny machine learning, or TinyML. TinyML is basically the latest technology for embedded systems, where deep learning and tiny devices are combined to create something with a lot of potential. Traditional machine learning often runs on powerful hardware somewhere in the cloud and therefore requires a good amount of resources.

  24. Tomi Engdahl says:

    In his latest Eloquent Arduino blog post, Simone Salerno achieves word classification on the Nano 33 BLE Sense with 96% accuracy.

    https://eloquentarduino.github.io/2020/08/better-word-classification-with-arduino-33-ble-sense-and-machine-learning/

  25. Tomi Engdahl says:

    “I’d recommend the Nano 33 BLE Sense as a good starter board for tinyML. It has a speedy but low-power Cortex-M4F processor, and enough RAM for some interesting models.”

    Check out this great interview with Edge Impulse’s Daniel Situnayake on the potential of embedded machine learning!

    (via Elektor Labs)

    https://www.elektormagazine.com/news/the-future-of-machine-learning-an-interview-with-daniel-situnayake/23577

  26. Tomi Engdahl says:

    “But thanks to recent advances companies are turning to tinyML as the latest trend in building product intelligence. Arduino is making tinyML available for millions of developers, and now together with Edge Impulse, they are turning the ubiquitous Arduino board into a powerful embedded ML platform.”

    (via The Next Web)

    https://thenextweb.com/neural/2020/09/03/tinyml-is-breathing-life-into-billions-of-devices/

  27. Tomi Engdahl says:

    Watch Fredrik Knutsson’s person detection demo on the Nano 33 BLE Sense to see what CMSIS-NN and TensorFlow Lite can do in terms of a performance uplift for tinyML applications.

    tinyML development with TensorFlow Lite and CMSIS-NN
    https://m.youtube.com/watch?feature=youtu.be&v=wVXnO9lu2BI

  28. Tomi Engdahl says:

    Simone Salerno proposes a partial, naive linear-time implementation of the Fourier transform for fast feature extraction on Arduino and embedded microcontrollers.

    “Principal” FFT components as efficient features extractor
    https://eloquentarduino.github.io/2020/09/principal-fft-components-as-efficient-features-extrator/

    Sadly, computing the transform over the whole spectrum of the signal still requires O(N log N) with the best implementation (FFT – Fast Fourier Transform); we would like to achieve faster computation on our microcontrollers.

    In this post I propose a partial, naive linear-time implementation of the Fourier Transform you can use to extract features from your data for Machine Learning models.

    We don’t actually need a full description of the input signal: we’re only interested in extracting some kind of signature that an ML model can use to distinguish among the different classes.

    I was thinking of a kind of PCA (Principal Component Analysis), but using the FFT spectrum as features.
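
    A rough NumPy sketch of that idea: keep only a handful of FFT bins as features, chosen here by their variance across the training windows (the selection criterion is my assumption, not necessarily what the principal-fft package does). On the microcontroller, each selected bin can then be computed directly in O(N), so a fixed number of bins keeps feature extraction linear-time:

    import numpy as np

    def fit_principal_bins(X, n_bins=8):
        # Pick the FFT bins whose magnitude varies the most across training windows.
        spectra = np.abs(np.fft.rfft(X, axis=1))
        return np.sort(np.argsort(spectra.var(axis=0))[-n_bins:])

    def extract_features(X, bins):
        # Keep only the magnitudes of the selected bins as the feature vector.
        return np.abs(np.fft.rfft(X, axis=1))[:, bins]

    # Toy demo: 100 windows of 256 samples each (e.g. accelerometer or audio frames).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 256))
    bins = fit_principal_bins(X, n_bins=8)
    print(extract_features(X, bins).shape)  # (100, 8)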

  29. Tomi Engdahl says:

    Machine learning at the edge for about the price of a fancy coffee!

    Official TensorFlow Lite Micro Support Comes to the ESP32
    https://www.hackster.io/news/official-tensorflow-lite-micro-support-comes-to-the-esp32-9708fb6a760f

    Machine learning at the edge for about the price of a fancy coffee

    With the advent of TensorFlow Lite, and subsequently TensorFlow Lite Micro, machine learning has moved from CPUs and GPUs to mobile phones and SBCs, then finally to inexpensive MCUs. Boards like the Arduino Nano 33 BLE Sense and STM32F746 Discovery kit brought machine learning to devices whose cost is in the range of a night out, rather than a month’s rent. But the announcement of TensorFlow Lite Micro support for the ESP32 means development targets with a cost more in line with a fancy coffee!

    https://blog.tensorflow.org/2020/08/announcing-tensorflow-lite-micro-esp32.html?m=1

  30. Tomi Engdahl says:

    Available as source and a Python package, Principal FFT is designed to rapidly and accurately extract features from data.

    Eloquent Arduino’s Principal FFT Offers High-Accuracy Machine Learning Feature Extraction
    https://www.hackster.io/news/eloquent-arduino-s-principal-fft-offers-high-accuracy-machine-learning-feature-extraction-f408b48c90d2

    Available as source and a Python package, Principal FFT is designed to rapidly and accurately extract features from data.

  31. Tomi Engdahl says:

    Scan a plant’s leaves using an Arduino Nano 33 BLE Sense and train a tinyML model with Edge Impulse to detect if it’s diseased.

    Determining a Plant’s Health with TinyML
    https://www.hackster.io/gatoninja236/determining-a-plant-s-health-with-tinyml-085003

    Scan the leaves of a plant with an Arduino Nano 33 BLE Sense and train a model to detect if it’s diseased.

  32. Tomi Engdahl says:

    Qeexo’s AutoML platform brings out-of-the-box machine learning to Arduino Cortex-M0+ boards like the Nano 33 IoT: https://bit.ly/3bGyh1O

  33. Tomi Engdahl says:

    Compiling And Optimizing Neural Nets
    Inferencing with lower power and improved performance.
    https://semiengineering.com/compiling-and-optimizing-neural-nets/

    Edge inference engines often run a slimmed-down real-time engine that interprets a neural-network model, invoking kernels as it goes. But higher performance can be achieved by pre-compiling the model and running it directly, with no interpretation — as long as the use case permits it.

    At compile time, optimizations are possible that wouldn’t be available if interpreting. By quantizing automatically, merging nodes and kernels, and binding variables into constants where feasible, substantial efficiency improvements can be achieved. That means inference will run more quickly with less power.

    “Running a neural network efficiently and achieving good performance on an edge device have two significant challenges — processing compute intensive convolutions and manipulating large amounts of data,” said Steve Steel, director of product marketing, machine-learning group at Arm. “Both of these challenges must be solved in order to realize a balanced system design.”

    A typical configuration has the inference engine running a limited version of one of the machine-learning frameworks, like TensorFlow Lite. That engine takes a relatively abstract version of the neural-network model and interprets the required execution, calling on small, self-contained programs called “kernels” that perform various functions. A convolution would be run by a kernel. An activation function might be another kernel. The run-time interpreter invokes the kernels as it processes the network.

    But interpretation adds a layer of computing to the whole problem. In addition to doing the actual inference, additional computing is needed to reduce the high level of the model to specific execution instructions. That takes time and energy to do.
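
    For a concrete picture of the interpret-at-runtime flow described above, this is roughly what it looks like with the desktop Python API; TensorFlow Lite Micro follows the same load / allocate / invoke pattern in C++ (the model.tflite file is assumed to exist, e.g. the output of the TensorFlow Lite converter):

    import numpy as np
    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Feed one input tensor, then let the interpreter walk the graph,
    # invoking the kernel for each operator as it goes.
    interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
    interpreter.invoke()
    print(interpreter.get_tensor(out["index"]))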

  34. Tomi Engdahl says:

    VIA Pixetto Is a Smart Scratch, Python, and TensorFlow Lite Computer Vision Platform for Schools
    Programmable in Scratch, Python, JavaScript, and TensorFlow Lite, the Pixetto can be used with Arduinos, BBC micro:bits, and Raspberry Pis
    https://www.hackster.io/news/via-pixetto-is-a-smart-scratch-python-and-tensorflow-lite-computer-vision-platform-for-schools-62d48967cc9c

  35. Tomi Engdahl says:

    Qeexo AutoML Shrinks Automated Machine Learning Footprint to Fit Cortex-M0(+)
    Automated ML platform offers support for Arm’s tiniest cores — an industry first.
    https://www.hackster.io/news/qeexo-automl-shrinks-automated-machine-learning-footprint-to-fit-cortex-m0-91067d19e598

    As the power of machine learning migrates from large GPUs to mobile phones on down to mid-range microcontrollers, tools like Edge Impulse Studio enable developers to manage the entire pipeline in easy-to-use software, then deploy to Cortex-M7, M4, or even M3 targets.

    https://www.edgeimpulse.com/

  36. Tomi Engdahl says:

    Let This Crying Detecting Classifier Offer Some Much Needed Reprieve
    https://hackaday.com/2020/09/15/let-this-crying-detecting-classifier-offer-some-much-needed-reprieve/

    Baby monitors are cool, but [Ish Ot Jr.] wanted his to only transmit sounds that required immediate attention and filter any non-emergency background noise. Posed with this problem, he made a baby monitor that would only send alerts when his baby was crying.

    For his project, [Ish] used an Arduino Nano 33 BLE Sense due to its built-in microphone, sizeable RAM for storing large chunks of data, and its BLE capabilities for later connecting with an app. He began his project by collecting background noise using Edge Impulse Studio’s data acquisition functionality. [Ish] emphasized that Edge Impulse was really doing all the work for him.

    BABL: A Baby Monitor Powered by tinyML and Edge Impulse!
    BABL leverages tinyML to distinguish a baby’s cry from other noise, preventing false alarms, and alerting parents only when needed.
    https://www.hackster.io/ishotjr/babl-a-baby-monitor-powered-by-tinyml-and-edge-impulse-f5045f

  37. Tomi Engdahl says:

    In development for a year now, Adafruit Industries’ BrainCraft HAT is finally available — and it has some great design tweaks, including a cooling fan!

    Adafruit Launches BrainCraft HAT for Raspberry Pi as an All-in-One TensorFlow Lite Dev Platform
    https://www.hackster.io/news/adafruit-launches-braincraft-hat-for-raspberry-pi-as-an-all-in-one-tensorflow-lite-dev-platform-6c18259279c0

    In development for a year now, the BrainCraft HAT is finally available — and it has some great design tweaks, including a cooling fan.

  38. Tomi Engdahl says:

    This ESP32-based device uses sensor data along with machine learning to recognize different events that take place inside your house.

    Placemon Uses ML to Monitor Homes
    https://www.hackster.io/news/placemon-uses-ml-to-monitor-homes-a933db609279

    This ESP32-based device uses sensor data along with machine learning to recognize different events that take place inside your house.

    The IoT has allowed us to keep tabs on just about everything in our homes, from refrigerators to entertainment systems, and all things in between. While there are many home monitoring kits on the market, others prefer to design their own, which is what Hackaday user ‘anfractuosity’ did with Placemon – an open hardware platform that senses what’s happening in the home.

    Training TensorFlow is done using easily repeatable events, like kitchen sounds and operating appliances. Once the specific actions can be identified, Placemon will send wireless notifications of the triggered occurrence. At least that’s what anfractuosity hopes it will do eventually, as the platform is still in development.

    Placemon – sense your home
    An open hardware project to sense what is happening in your home
    https://hackaday.io/project/173368-placemon-sense-your-home

  39. Tomi Engdahl says:

    Use a TensorFlow Lite model on a Nano 33 BLE Sense to control a robotic car via voice commands.

    Arduino Machine Learning: Build a Tensorflow lite model to control robot-car
    https://www.survivingwithandroid.com/arduino-machine-learning-tensorflow-lite/

