How Gradio and W&B Work Beautifully Together

Gradio lets you build a UI for your machine learning model. W&B lets you track it alongside your experiments. Here’s how to use the two libraries together, along with a live demo!
Abubakar Abid

Introduction

Weights and Biases (W&B) allows data scientists and machine learning scientists to track their machine learning experiments at every stage, from training to production. Any metric can be aggregated over samples and shown in panels in a customizable and searchable dashboard, like below:
But while aggregate metrics are a good summary of a model’s performance, a complementary approach to understanding models is to let users interactively explore a model’s predictions on individual samples.
By dragging and dropping an image, writing text, recording audio, and so on, interdisciplinary machine learning teams can test models and help discover biases and failure points that might be invisible in aggregate metrics. Oftentimes, testing a model is done by non-coders, which means we need intuitive front ends like this one:
Building these demos often takes quite a lot of time, since it involves creating a web-based front end and infrastructure for hosting and sharing the model. This is where Gradio comes in. The open-source Python library (with more than 2,800 stars) lets machine learning developers create demos and GUIs from models very easily, with just a few lines of Python. Here’s the code for an image-denoising demo like the one above:
import gradio as gr

# `denoising_model` is your model's prediction function (takes and returns a PIL image)
input = gr.inputs.Image(type='pil', label="Original Image")
output = gr.outputs.Image(type='pil')

gr.Interface(denoising_model, input, output).launch()
We’re excited to share that Gradio and W&B now work beautifully together! You can create a Gradio demo and include it in your W&B dashboard with just 1 extra line of code so colleagues and team members can understand and explore your model, regardless of their technical backgrounds!
Here’s how it works, from start to finish, in 7 quick steps. (If you’d like, first refresh your knowledge of the basics of Gradio and the basics of W&B.)

1. Create a W&B account

Follow the instructions at https://app.wandb.ai/login to create your free account if you don’t have one already:

2. Install Gradio and W&B

Since both Gradio and W&B are open-source Python libraries, simply open up the environment where you are running Python and run the following in the terminal:
pip install gradio wandb

3. Log in to your W&B account

In the same terminal, run:
wandb login
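If you’d rather stay inside a notebook, you can also authenticate from Python. A minimal sketch (it will prompt you to paste the API key from your W&B settings page):

import wandb

# Prompts for (or reuses) your API key; equivalent to running `wandb login` in the terminal.
wandb.login()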

4. Train a model or use a pretrained model

You can train a model from scratch and use Weights and Biases to track experiments and metrics, or you can use a pretrained model. Here, we use a pretrained PyTorch image classifier, ResNet18.
import torch, requests
from torchvision import transforms
from PIL import Image

model = torch.hub.load('pytorch/vision:v0.6.0', 'resnet18', pretrained=True).eval()

# Download human-readable labels for ImageNet.
response = requests.get("https://git.io/JJkYN")
labels = response.text.split("\n")

def predict(inp):
    inp = Image.fromarray(inp.astype('uint8'), 'RGB')
    inp = transforms.ToTensor()(inp).unsqueeze(0)
    with torch.no_grad():
        prediction = torch.nn.functional.softmax(model(inp)[0], dim=0)
    return {labels[i]: float(prediction[i]) for i in range(1000)}
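If you train a model from scratch instead, tracking metrics comes down to calling wandb.log() inside your training loop. The sketch below uses a toy linear model and synthetic data as stand-ins for your real training code, and the project name is a placeholder:

import torch
import wandb

wandb.init(project="your-test-project")

# A toy model and synthetic data, standing in for your real training setup.
toy_model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(toy_model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()
x, y = torch.randn(256, 10), torch.randn(256, 1)

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(toy_model(x), y)
    loss.backward()
    optimizer.step()
    # Each call to wandb.log adds a data point to the panels in your dashboard.
    wandb.log({"epoch": epoch, "loss": loss.item()})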

5. Create a Gradio Demo

In your Jupyter notebook or Python environment, create a Gradio demo, just as you would normally. When you call launch(), make sure to set share=True so that Gradio hosts the demo on a publicly shareable link.
import gradio as gr

inputs = gr.inputs.Image()
outputs = gr.outputs.Label(num_top_classes=3)

io = gr.Interface(fn=predict, inputs=inputs, outputs=outputs)
io.launch(share=True)

6. Create a W&B Run

Create a W&B run, just as you would normally, by importing wandb and initializing a run:
import wandb

wandb.init(project="your-test-project")
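If you like, you can also record details about the model on the same run; for instance (the keys and values below are illustrative placeholders):

# Optionally log details about the model alongside the demo.
# The keys and values here are illustrative placeholders.
wandb.config.update({"architecture": "resnet18", "source": "torch.hub"})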

7. Integrate Gradio

The last step—integrating your Gradio demo with your W&B dashboard—is just one extra line:
io.integrate(wandb=wandb)

Live Demo

Once you call integrate(), a demo is created that you can embed in your dashboard or report, like this!