PyTorch Tutorial
TL;DR: Logging basic PyTorch models
In this project, I follow a wonderful tutorial on getting started with PyTorch by yunjev and instrument the examples with Weights & Biases, showing different ways to add logging, visualize training, and explore the effects of hyperparameters. The source tutorial features many examples split into three levels of difficulty. I've chosen three of these to demonstrate the approach: a basic feedforward net, a convolutional neural net, and a recurrent neural net. All three nets train on the gold standard of MNIST for simplicity. I hope to demonstrate how to conduct experiments with wandb and visualize the results in clear and useful ways, with the eventual goal of building better, more explainable, and more theoretically sound models.
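To make the instrumentation concrete, here is a minimal sketch of how wandb hooks into a plain PyTorch training loop on MNIST. The project name, hyperparameter values, and network shape are illustrative placeholders rather than the exact settings behind the panels below.

```python
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as transforms
import wandb

# Hyperparameters go into the wandb config so every run is self-describing and comparable.
config = dict(hidden_size=128, epochs=2, batch_size=100, lr=1e-3)
wandb.init(project="pytorch-tutorial", config=config)  # project name is a placeholder

train_data = torchvision.datasets.MNIST(root="data", train=True, download=True,
                                        transform=transforms.ToTensor())
train_loader = torch.utils.data.DataLoader(train_data,
                                           batch_size=config["batch_size"], shuffle=True)

# A basic feedforward net: 784 -> hidden -> 10
model = nn.Sequential(nn.Flatten(),
                      nn.Linear(28 * 28, config["hidden_size"]),
                      nn.ReLU(),
                      nn.Linear(config["hidden_size"], 10))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=config["lr"])

wandb.watch(model, log="all")  # optionally track gradients and weights as histograms

for epoch in range(config["epochs"]):
    for step, (images, labels) in enumerate(train_loader):
        outputs = model(images)
        loss = criterion(outputs, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        if step % 100 == 0:
            # Each wandb.log call adds a point to the run's charts.
            wandb.log({"epoch": epoch, "loss": loss.item()})
```

Keeping the hyperparameters in the wandb config is what makes the comparisons in the sections below possible: every run carries its own settings, so runs can be grouped and filtered by them.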
01: Effect of hidden layer size on basic feedforward net
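This panel compares runs that differ only in the width of the hidden layer. A minimal sketch of how such a comparison can be launched, assuming a hypothetical `train` function that wraps the loop above (the sizes and project name are placeholders):

```python
import wandb

# Each hidden size becomes its own wandb run, so the resulting loss curves
# can be grouped by config and compared side by side in the report panel.
for hidden_size in [32, 64, 128, 256, 512]:
    run = wandb.init(project="pytorch-tutorial",
                     config={"hidden_size": hidden_size},
                     reinit=True)      # start a fresh run inside the same script
    train(run.config)                  # hypothetical helper wrapping the training loop above
    run.finish()
```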
02: Overfitting in RNNs
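To surface overfitting, it helps to log training and validation loss on the same chart: the two curves diverge once the RNN starts memorizing the training set. A minimal sketch of a validation pass, assuming the RNN consumes each MNIST image as a 28-step sequence of 28 features:

```python
import torch

def evaluate(model, loader, criterion):
    """Average loss on a held-out loader, logged as the val_loss metric."""
    model.eval()
    total, batches = 0.0, 0
    with torch.no_grad():
        for images, labels in loader:
            # each (1, 28, 28) MNIST image becomes a 28-step sequence of 28 features
            outputs = model(images.squeeze(1))
            total += criterion(outputs, labels).item()
            batches += 1
    model.train()
    return total / batches

# at the end of every epoch, inside the training loop from the first example:
# wandb.log({"train_loss": train_loss, "val_loss": evaluate(model, test_loader, criterion)})
```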
03: CNN layer size combinations
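A sketch of how different channel counts for the two convolutional layers can each be logged as a separate run; the specific values and project name are illustrative:

```python
import itertools
import torch.nn as nn
import wandb

# Try every pairing of channel counts for the two conv layers and log each as its own run.
for c1, c2 in itertools.product([16, 32], [32, 64]):
    wandb.init(project="pytorch-tutorial", config={"conv1": c1, "conv2": c2}, reinit=True)
    model = nn.Sequential(
        nn.Conv2d(1, c1, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),   # 28x28 -> 14x14
        nn.Conv2d(c1, c2, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),  # 14x14 -> 7x7
        nn.Flatten(), nn.Linear(7 * 7 * c2, 10))
    # ... train and wandb.log() exactly as in the feedforward example ...
    wandb.finish()
```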