Weights & Biases with Edge Impulse
Building a System of Record for Edge ML
This guide shows how to integrate Edge Impulse with Weights & Biases and get started with tracking metrics in W&B.
How small is Tiny?
Enter the Microcontroller
There is a growing number of edge applications that run on devices such as the NVIDIA Jetson or mobile phones.
Edge Impulse enables users to connect to these edge devices, and Weights & Biases can be installed on any device that runs a full operating system. Microcontrollers, however, are far more constrained: they typically have RAM on the order of kilobytes and run pre-compiled models and programs.
These devices cannot run a full Python interpreter and generally work best with compiled C++ binaries. Edge Impulse has developed methods for codeless deployment and for running data pre-processing pipelines on these devices.
Setup
1. Install Node.js (on macOS, via Homebrew):
brew install node
2. Set up your accounts. You need both an Edge Impulse (EI) account and a W&B account to set up the connectivity.
3. Install the EI CLI:
npm install -g edge-impulse-cli
4. Pull the data from EI:
edge-impulse-blocks runner --download-data
This will download your project's pre-processed data locally, ready to log to W&B.
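As a quick sanity check, you can load the downloaded features in Python. The project ID (131527) and file names below follow the training commands later in this report and may differ for your project.
```python
import numpy as np

# Load the features exported by the EI blocks runner; the project ID and
# file names are taken from the commands in this report and may differ.
X = np.load("ei-block-data/131527/X_train_features.npy")
y = np.load("ei-block-data/131527/y_train.npy")
print(X.shape, y.shape)  # e.g. (n_samples, 6640) and (n_samples,)
```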
Dataset
This is a snapshot of the dataset from Edge Impulse.

This is the dataset as a W&B Table. You can play the audio files directly within the Table.
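If you want to build such a table yourself, here is a minimal sketch. The directory layout and label-parsing scheme below are assumptions for illustration, not the exact Edge Impulse export format.
```python
import glob
import wandb

# A minimal sketch of logging audio clips to a W&B Table; the directory
# layout and label parsing are assumptions, not the EI export format.
run = wandb.init(project="birdsong-classifier", job_type="dataset")
table = wandb.Table(columns=["file", "audio", "label"])

for path in glob.glob("ei-block-data/131527/audio/*.wav"):
    label = path.split("/")[-1].split(".")[0]  # hypothetical naming scheme
    table.add_data(path, wandb.Audio(path), label)

run.log({"dataset": table})
run.finish()
```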
[W&B Table panel, run: upbeat-dust-1333]
Training 🏃🏻♀️
Run training sweeps with the W&B custom block integration for Edge Impulse.
Add the corresponding inputs and run one of the following commands to start a sweep.
With Docker 🐳
docker run --rm -v $PWD:/scripts wandb-custom-block \
  --x-file ei-block-data/131527/X_train_features.npy \
  --y-file ei-block-data/131527/y_train.npy \
  --epochs 100 --learning-rate 0.005 \
  --validation-set-size 0.2 --input-shape "(6640,)" \
  --out-directory out --wandb-api your_api --sweep
With the W&B Launch CLI 🚀
wandb launch -d launch-quickstart -p birdsong-classifier \
  -a x-file=ei-block-data/131527/X_train_features.npy \
  -a y-file=ei-block-data/131527/y_train.npy \
  -a epochs=100 -a learning-rate=0.005 -a validation-set-size=0.2 \
  -a input-shape="(6640,)" -a out-directory=out \
  -a wandb-api='your_api' -a batch-size=40
Training Weights & Biases Sweep 🧹
The integrated Weights & Biases sweep block can automatically run a sweep over a large set of hyperparameters, including the activation function, batch size, convolutional kernel size, optimizer type, and optimizer parameters. Sweeps are a powerful way of finding the optimal training parameters; a sketch of such a configuration is shown below.
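For reference, a minimal random-sweep configuration covering these hyperparameters might look like this; the exact parameter names used by the Edge Impulse sweep block may differ, so treat these as illustrative assumptions.
```python
import wandb

# A minimal random-sweep configuration mirroring the hyperparameters
# discussed above; parameter names are illustrative assumptions.
sweep_config = {
    "method": "random",
    "metric": {"name": "val_accuracy", "goal": "maximize"},
    "parameters": {
        "activation": {"values": ["relu", "gelu", "elu"]},
        "batch_size": {"values": [16, 32, 64]},
        "kernel_size": {"values": [3, 5, 7]},
        "optimizer": {"values": ["adam", "adamax", "nadam"]},
        "learning_rate": {"min": 0.0001, "max": 0.01},
    },
}

sweep_id = wandb.sweep(sweep_config, project="birdsong-classifier")
# wandb.agent(sweep_id, function=train)  # `train` is your training function
```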
Here we perform a random sweep and see a somewhat unexpected result: GELU (Gaussian Error Linear Unit) as the activation function provides the best validation accuracy. We can also see which models learn during training and how the model parameters behave.
Further, you can see that GFLOPs (giga floating-point operations) scales with kernel size, as you would expect, and that the Adamax and Nadam optimizers provide slightly better results.
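The kernel-size relationship is easy to sanity-check with a back-of-the-envelope FLOPs estimate; the layer dimensions below are hypothetical, not taken from the actual sweep models.
```python
# Rough FLOPs estimate for a 1D convolution: compute grows linearly
# with kernel size. Layer dimensions are hypothetical.
def conv1d_flops(kernel_size, c_in=1, c_out=16, out_len=6640):
    # ~2 ops (multiply + add) per weight per output position
    return 2 * kernel_size * c_in * c_out * out_len

for k in (3, 5, 7):
    print(f"kernel={k}: {conv1d_flops(k) / 1e9:.4f} GFLOPs")
```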
[Run set panel: 25 runs]
See the W&B Sweeps documentation to learn more about sweeps.
Validation Metrics
[Run set panel: 37 runs]
Training Metrics
Loss and accuracy (vanilla) are logged via the Weights & Biases Keras callback.
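A minimal sketch of how this callback is wired up, assuming a toy Keras model and random stand-in data (shapes and class count are assumptions for illustration):
```python
import numpy as np
import tensorflow as tf
import wandb
from wandb.keras import WandbCallback

# Dummy stand-ins for the Edge Impulse features; shapes and class count
# are assumptions for illustration.
X = np.random.rand(200, 6640).astype("float32")
y = np.random.randint(0, 4, size=(200,))

wandb.init(project="birdsong-classifier")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(6640,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# WandbCallback streams per-epoch loss and accuracy to the W&B run.
model.fit(X, y, validation_split=0.2, epochs=5,
          callbacks=[WandbCallback()])
```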
[Run set panel: 61 runs]
Test Set (unseen data)
Related Reports
Using W&B to Monitor On Edge Device
Accuracy on ImageNet of Image Classification Models
Examining the relationship between the number of trainable parameters and the accuracy of image classification models trained on the ImageNet benchmark dataset.
Small Is Beautiful
Edge AI for Image Aesthetic Assessment