
Example bar chart


TL;DR: Logging basic PyTorch models

This project instruments PyTorch Tutorial for Deep Learning Researchers by yunjey with Weights & Biases to show different ways to:

  • add logging to a new Python project (a minimal sketch follows this list)
  • visualize training
  • explore the effects of hyperparameters
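
For the first point, instrumenting a training loop takes only a few calls: wandb.init to start a run, a config to record hyperparameters, and wandb.log to stream metrics. Below is a minimal sketch of that pattern on a stand-in feedforward model; the project name, hyperparameter values, metric names, and random batches are illustrative assumptions, not the exact settings behind the runs in this report.

```python
# A minimal sketch of adding W&B logging to a PyTorch training loop.
# Project name, hyperparameters, and metric names are illustrative
# assumptions; random batches stand in for a real DataLoader.
import torch
import torch.nn as nn
import wandb

config = {"hidden_size": 256, "lr": 1e-3, "epochs": 5}
run = wandb.init(project="pytorch-tutorial-mnist", config=config)

model = nn.Sequential(nn.Flatten(),
                      nn.Linear(28 * 28, config["hidden_size"]),
                      nn.ReLU(),
                      nn.Linear(config["hidden_size"], 10))
optimizer = torch.optim.Adam(model.parameters(), lr=config["lr"])
criterion = nn.CrossEntropyLoss()

for epoch in range(config["epochs"]):
    images = torch.randn(64, 1, 28, 28)           # stand-in batch of "images"
    labels = torch.randint(0, 10, (64,))
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    wandb.log({"epoch": epoch, "train_loss": loss.item()})  # builds the training curves

run.finish()
```

Because hyperparameters go through the run config, the W&B UI can later group, filter, and compare runs by them, which is what the panels below rely on.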

The source tutorial features many examples split into three levels of difficulty. I've chosen three to explore, all trained on MNIST (60K train, 10K test) for simplicity; skeleton definitions of each are sketched after this list:

  • basic feedforward net
  • convolutional neural net
  • recurrent neural net
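
As a rough guide to what those three models look like, here are skeleton definitions shaped for 28×28 MNIST digits. They follow the general structure of yunjey's tutorial, but the specific layer sizes and the choice of an LSTM for the recurrent example are my assumptions.

```python
# Skeletons of the three architectures, shaped for 28x28 MNIST digits.
# Exact layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class FeedForwardNet(nn.Module):
    def __init__(self, hidden_size=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                          # 1x28x28 image -> 784 vector
            nn.Linear(28 * 28, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, 10))            # 10 digit classes

    def forward(self, x):
        return self.net(x)

class ConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2))
        self.classifier = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

class RecurrentNet(nn.Module):
    def __init__(self, hidden_size=128, num_layers=2):
        super().__init__()
        # read each image as a sequence of 28 rows with 28 features per row
        self.lstm = nn.LSTM(28, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, 10)

    def forward(self, x):
        out, _ = self.lstm(x.squeeze(1))           # (batch, 28, 28) -> (batch, 28, hidden)
        return self.fc(out[:, -1, :])              # classify from the last time step
```

All three take the same (batch, 1, 28, 28) input, so any of them can be dropped into the training loop sketched earlier.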

The observations and insights below are based on a small number of noisy, high-variance experiments. They are hypotheses informed by past experience, meant to showcase what is possible with W&B and to inspire further exploration.

I hope to demonstrate how to conduct experiments with wandb and visualize the results in clear and useful ways, with the eventual goal of building better, more explainable, and more theoretically sound models.

01: Effect of Hidden Layer Size on Basic Feedforward Net




[Run set: Vary Hidden Layer Size (11 runs)]
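
One way a run set like "Vary Hidden Layer Size" can be produced is to launch one W&B run per hidden size, storing the size in the run config so that panels (including a bar chart of a summary metric) can compare runs by it. The hidden sizes, project and group names, and logged metric below are assumptions sketching the pattern, not the exact 11 runs in the panel above.

```python
# A sketch of producing one W&B run per hidden size, with the size recorded
# in the run config. All names and values here are assumptions.
import torch.nn as nn
import wandb

for hidden_size in [16, 32, 64, 128, 256, 512]:
    run = wandb.init(project="pytorch-tutorial-mnist",     # hypothetical project name
                     group="vary-hidden-layer-size",       # hypothetical group name
                     config={"hidden_size": hidden_size})
    model = nn.Sequential(nn.Flatten(),
                          nn.Linear(28 * 28, hidden_size),
                          nn.ReLU(),
                          nn.Linear(hidden_size, 10))
    # ... training and evaluation loop as in the first sketch ...
    wandb.log({"parameters": sum(p.numel() for p in model.parameters())})
    run.finish()
```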


02: RNNs: Balance Generalization with Overfitting




[Run sets: 1 Layer Count (27 runs), 2 Batch Size (27 runs), 3 Hidden Size (27 runs)]
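
Each of the three run sets above holds 27 runs, which is consistent with (though not necessarily) a 3 × 3 × 3 grid over layer count, batch size, and hidden size. As a hedged sketch of how such a grid could be launched, one W&B run per combination with every value named in the config; all specific values and names below are my assumptions.

```python
# Sketch of a 3 x 3 x 3 grid over layer count, batch size, and hidden size,
# with one W&B run per combination. Values and names are assumptions.
import itertools
import torch.nn as nn
import wandb

layer_counts = [1, 2, 3]
batch_sizes = [32, 64, 128]
hidden_sizes = [64, 128, 256]

for num_layers, batch_size, hidden_size in itertools.product(
        layer_counts, batch_sizes, hidden_sizes):
    run = wandb.init(project="pytorch-tutorial-mnist",     # hypothetical project name
                     group="rnn-hyperparameter-grid",      # hypothetical group name
                     config={"num_layers": num_layers,
                             "batch_size": batch_size,
                             "hidden_size": hidden_size})
    # one common recurrent choice: an LSTM reading each image as 28 rows of 28 features
    model = nn.LSTM(28, hidden_size, num_layers, batch_first=True)
    # ... build a DataLoader with this batch_size and train/evaluate here ...
    wandb.log({"parameters": sum(p.numel() for p in model.parameters())})
    run.finish()
```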