Since 2009, coders have created thousands of experiments using Chrome, Android, AI, WebVR, AR and more. We showcase these projects and a variety of helpful tools and resources to inspire a diverse community of makers to explore, create, and share what’s possible with these technologies.
FUI (Finger User Interface) is a part of the TensorFlow Lite for Microcontroller Experiments, a collection of open source, interactive projects designed to demonstrate some fun ways to combine Arduino and TensorFlow Lite for Microcontrollers.
These projects were built with the Arduino Nano 33 BLE Sense, TensorFlow Lite for Microcontrollers, standard web technologies (HTML, CSS & JavaScript), and p5.js.
Finger User Interface or FUI (pronounced Foo-ey) lets you control connected devices with the wave of a finger.
Other experiments to explore:
- Air Snare lets you play the drums in the air.
- Tiny Motion Trainer lets you train and test IMU based TFLite models in the browser.
- Morning Mountain lets you stop your alarm clock from ringing by striking a pose.
- Astrowand lets you draw shapes in the air to form constellations.
- Linux, MacOS or Windows computer with Chrome installed
- TensorFlow Microcontroller Challenge Kit or Arduino Nano 33 BLE Sense
- Micro USB cable (If you're on a USB-C laptop, instead get a USB-C to Micro USB cable)
- Rubber band
- [Optional] Battery
Flashing: Using the Arduino Nano 33 BLE Sense
- Install the Arduino IDE
- Set up the Arduino board:
  - Plug in the board.
  - Install the board by navigating to Tools > Board > Boards Manager and searching for Arduino Mbed OS Nano Boards. Full instructions (including drivers required for Windows) here.
  - An FAQ for connection problems can be found here.
  - After the board is installed, select it under Tools > Board > Arduino Mbed OS Nano Boards > Arduino Nano 33 BLE.
  - Select the port by navigating to Tools > Port > dev/cu... (Arduino Nano 33 BLE).
- Install Arduino libraries
  - Navigate to Tools > Manage Libraries
  - Search for and install:
    - Arduino_LSM9DS1
    - ArduinoBLE
    - Arduino_TensorFlowLite
- Open the sketch and flash
  - Download the latest release of the tf4micro motion kit here
  - Open arduino/tf4micro-motion-kit and double-click the tf4micro-motion-kit.ino file
  - Click the right arrow in the top left corner to build and upload the sketch
  - Warning: This process may take a few minutes. Warnings may also appear, but the upload should still succeed in spite of them.
  - Once the sketch is installed, the LED on the board should flash red and green. (A standalone blink test you can try first is shown after this list.)
- Go to the URL for the experiment (the URLs are listed below) and play!
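If uploads do not seem to be working, one way to sanity-check the board and toolchain before flashing the kit sketch is a tiny blink test like the one below. This is only an illustrative check, not part of the kit; it assumes the onboard RGB LED pin names (LEDR, LEDG, LEDB) provided by the Nano 33 BLE board definition, which are wired active-LOW.

```cpp
// Sanity-check sketch (not part of tf4micro-motion-kit): cycles the onboard RGB LED.
// LEDR, LEDG, LEDB come from the Arduino Nano 33 BLE board definition; the LED is
// active-LOW, so writing LOW turns a color on.
void setup() {
  pinMode(LEDR, OUTPUT);
  pinMode(LEDG, OUTPUT);
  pinMode(LEDB, OUTPUT);
  digitalWrite(LEDR, HIGH);  // all colors off
  digitalWrite(LEDG, HIGH);
  digitalWrite(LEDB, HIGH);
}

void loop() {
  digitalWrite(LEDR, LOW);   // red on
  delay(300);
  digitalWrite(LEDR, HIGH);
  digitalWrite(LEDG, LOW);   // green on
  delay(300);
  digitalWrite(LEDG, HIGH);
  digitalWrite(LEDB, LOW);   // blue on
  delay(300);
  digitalWrite(LEDB, HIGH);
}
```

If this uploads and the LED cycles through colors, the IDE, board package, and cable are set up correctly and you can move on to the kit sketch.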
The board included in the TensorFlow Microcontroller Challenge Kit by SparkFun comes preflashed with a sketch that works with some of the experiments right out of the box. If you are using one of the “TensorFlow Micro” kits and just want to jump right into playing with the experiments, you can simply connect your Arduino to a power source (USB or battery) and connect to one of the following experiment URLs:
What exactly is being transferred when I “connect”?
When you’re connecting the board to your computer, a pre-trained TensorFlow Lite machine learning model gets transferred over BLE onto the device. The sketches that are uploaded to the Arduino include a common TensorFlow Lite for Microcontrollers Experiments model architecture. The different experiment websites change the behavior of the sketch by changing the model to one specifically made for the experience.
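For a rough idea of what the Arduino side of that transfer looks like, here is a minimal ArduinoBLE peripheral that accepts bytes written to a characteristic and accumulates them in a buffer. The UUIDs, device name, and buffer size are placeholders for illustration only; the actual tf4micro-motion-kit sketch defines its own service layout and a more complete chunked-transfer protocol.

```cpp
// Minimal sketch (illustrative only): receive bytes over BLE into a buffer.
#include <ArduinoBLE.h>

// Hypothetical UUIDs, NOT the kit's real ones.
BLEService modelService("0000aaaa-0000-1000-8000-00805f9b34fb");
BLECharacteristic modelChunk("0000aaab-0000-1000-8000-00805f9b34fb", BLEWrite, 128);

uint8_t modelBuffer[8 * 1024];  // where received model bytes accumulate
size_t modelLength = 0;

void setup() {
  Serial.begin(9600);
  if (!BLE.begin()) {
    Serial.println("Starting BLE failed!");
    while (1);
  }
  BLE.setLocalName("TFMicroKit-demo");       // placeholder name
  BLE.setAdvertisedService(modelService);
  modelService.addCharacteristic(modelChunk);
  BLE.addService(modelService);
  BLE.advertise();
}

void loop() {
  BLEDevice central = BLE.central();          // wait for the website to connect
  if (!central) return;
  while (central.connected()) {
    if (modelChunk.written()) {               // a chunk was written by the browser
      int len = modelChunk.valueLength();
      if (modelLength + len <= sizeof(modelBuffer)) {
        memcpy(modelBuffer + modelLength, modelChunk.value(), len);
        modelLength += len;
      }
    }
  }
  Serial.print("Received bytes: ");
  Serial.println(modelLength);
}
```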
What if I’m having issues connecting via bluetooth?
If you are having issues connecting try the following:
- Make sure your browser (Chrome or Edge) supports Bluetooth and that it is enabled.
- Make sure your device (laptop, phone, etc) supports Bluetooth and that it is working and enabled.
- Refresh the web page, unplug the Arduino power cable and then plug it back in to reset, then try connecting again.
NOTE: If you’re using a managed device, like a computer from school or work, your device policy may prevent BLE pairing.
My board isn’t showing up on my computer, even though it’s plugged in. What should I do?
Try unplugging the Arduino power cable and then plugging it back in to reset. Make sure you see the RGB LED blink red, green, and blue in sequence.
The model isn’t getting my movements right. What do I do?
The way you move may be different from the data we used to pre-train the model. Different people move differently. That’s why we created Tiny Motion Trainer, which lets you train a custom model based on the way you move.
Do you have plans to support other boards?
We made these projects to work specifically with the Arduino Nano, and we currently don’t have plans to expand support. However, all of the code is open sourced, so you can remix or modify as needed.
Where should I go from here if I want to make my own model or project?
You can create your own model in several different ways. Check out these links:
- Experiments Collection - Inspiration and more resources
- Tiny Motion Trainer - Code-free motion trainer for microcontrollers
- Teachable Machine - Code-free image model trainer
- TensorFlow Lite for Microcontrollers - Full documentation
- Free Harvard EdX Course - In-depth course on TensorFlow Lite for Microcontrollers and the TinyML Ecosystem
"What sensors do the experiments use?"
The IMU is an LSM9DS1. It is a 3-axis accelerometer, 3-axis gyroscope, and 3-axis magnetometer. This chip, made by STMicroelectronics, is a standard component supported by the Arduino_LSM9DS1 library. Read more here: https://www.arduino.cc/en/Guide/NANO33BLESense
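If you want to see the raw signals for yourself, a minimal sketch using the stock Arduino_LSM9DS1 library is shown below. It simply prints accelerometer (in g) and gyroscope (in degrees per second) readings over serial; it is a sketch of the idea, not part of the experiments' code.

```cpp
// Minimal IMU read-out for the Nano 33 BLE Sense using the Arduino_LSM9DS1 library.
#include <Arduino_LSM9DS1.h>

void setup() {
  Serial.begin(9600);
  while (!Serial);                 // wait for the serial monitor
  if (!IMU.begin()) {
    Serial.println("Failed to initialize the LSM9DS1 IMU!");
    while (1);
  }
}

void loop() {
  float ax, ay, az;                // acceleration in g
  float gx, gy, gz;                // angular rate in degrees/second
  if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
    IMU.readAcceleration(ax, ay, az);
    IMU.readGyroscope(gx, gy, gz);
    Serial.print(ax); Serial.print('\t');
    Serial.print(ay); Serial.print('\t');
    Serial.print(az); Serial.print('\t');
    Serial.print(gx); Serial.print('\t');
    Serial.print(gy); Serial.print('\t');
    Serial.println(gz);
  }
}
```

These accelerometer and gyroscope streams are the same signals the experiments' motion models consume.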
How do you shrink a TensorFlow model to fit on a microcontroller?
Post-training quantization is a conversion technique that can reduce model size while also improving CPU and hardware accelerator latency, with little degradation in model accuracy. You can quantize an already-trained float TensorFlow model when you convert it to TensorFlow Lite format using the TensorFlow Lite Converter. Read more here: https://www.tensorflow.org/lite/performance/post_training_quantization
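As a back-of-the-envelope illustration of what full-integer quantization does, the sketch below applies the affine mapping TensorFlow Lite uses, real_value = (quantized_value - zero_point) * scale, to convert between float and int8. The scale and zero-point values are made up for the example and are not taken from any particular model.

```cpp
// Illustration of TensorFlow Lite's affine int8 quantization mapping:
//   real_value = (quantized_value - zero_point) * scale
#include <cmath>
#include <cstdint>
#include <cstdio>

int8_t quantize(float x, float scale, int32_t zero_point) {
  int32_t q = static_cast<int32_t>(std::lround(x / scale)) + zero_point;
  if (q < -128) q = -128;          // clamp to the int8 range
  if (q > 127) q = 127;
  return static_cast<int8_t>(q);
}

float dequantize(int8_t q, float scale, int32_t zero_point) {
  return (static_cast<int32_t>(q) - zero_point) * scale;
}

int main() {
  const float scale = 0.05f;       // illustrative value, not from a real model
  const int32_t zero_point = -3;   // illustrative value, not from a real model
  float x = 1.234f;
  int8_t q = quantize(x, scale, zero_point);
  std::printf("x=%.3f -> q=%d -> x'=%.3f\n", x, q, dequantize(q, scale, zero_point));
  return 0;
}
```

Each weight stored this way takes one byte instead of four, which is roughly where the 4x size reduction comes from, at the cost of the small rounding error visible in the round trip above.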
This is not an official Google product, but a collection of experiments that were developed at the Google Creative Lab. This is not a library or code repository that intends to evolve. Instead, it is a snapshot alluding to what’s possible at this moment in time.
We encourage open sourcing projects as a way of learning from each other. Please respect our and other creators’ rights, including copyright and trademark rights when present, when sharing these works and creating derivative work. If you want more info on Google's policy, you can find that here.