
Report (November 8, 2019)

Created on November 8 | Last edited on November 8

TL;DR: Logging basic PyTorch models

In this project, I follow a wonderful getting-started tutorial on PyTorch by yunjev and instrument the examples with Weights & Biases, showing different ways to add logging, visualize training, and explore the effects of hyperparameters. The source tutorial features many examples split into three difficulty levels; I've chosen three of these to demonstrate the approach: a basic feedforward net, a convolutional neural net, and a recurrent neural net. All three train on MNIST, the gold standard of simple benchmarks. I hope to demonstrate how to conduct experiments with wandb and visualize the results in clear and useful ways, with the eventual goal of building better, more explainable, and more theoretically sound models.

Run set: 854 runs


01: Effect of hidden layer size on basic feedforward net
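The experiment in this section amounts to training the same feedforward net with several hidden-layer sizes and comparing the loss curves. A hypothetical sketch of that sweep, again with a fixed random batch in place of MNIST (in the real experiment each configuration would be its own wandb run, with `hidden_size` stored in the run config):

```python
# Hypothetical sketch of the hidden-size sweep: train the same
# two-layer feedforward net with several hidden sizes and record
# the final training loss for each.
import torch
import torch.nn as nn

def train(hidden_size, steps=100, seed=0):
    torch.manual_seed(seed)
    x = torch.randn(64, 28 * 28)          # stand-in for MNIST images
    y = torch.randint(0, 10, (64,))
    model = nn.Sequential(
        nn.Linear(28 * 28, hidden_size),
        nn.ReLU(),
        nn.Linear(hidden_size, 10),
    )
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
        # in the instrumented version: wandb.log({"loss": loss.item()})
    return loss.item()

# One entry per hidden size; the report's charts compare these runs.
final = {h: train(h) for h in (16, 128, 512)}
```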



02: Overfitting in RNNs



03: CNN Layer Size Combinations






