
W&B Study Group Lectures: fast.ai w/ Hugging Face

A Hugging Face course study group for fast.ai developers looking to leverage fast.ai to train and deploy Transformers.
Created on July 12|Last edited on August 4

Register for the remaining live lectures!

🎉 Participate in our competition 🎉

3 winners per competition - compete for:
  1. Best Blog Post on Training Transformers with fastai
  2. Best Results On SST-2 (The Stanford Sentiment Treebank)
  3. Most Interesting Application of a fastai Trained Transformer
Register for the event & submit your entry to wgilliam@ohmeow.com by Tuesday, August 3, 12pm PT



Recap of Lecture 1

In the first lecture of the W&B x fast.ai x Hugging Face Study Group, Wayde Gilliam made sure you:
  • Can run the code in these sections in Colab, and
  • Understand why transformers work well for a myriad of NLP tasks and how they work
(Optional) homework till Sunday, July 18, 10am PT / 7pm CET / 10:30pm IST: read the "Attention Is All You Need" paper by Vaswani et al. and "The Illustrated Transformer" by Jay Alammar.
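For reference while reading: the core operation introduced in the Vaswani et al. paper is scaled dot-product attention,

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^\top}{\sqrt{d_k}}\right)V$$

where Q, K, and V are the query, key, and value matrices and d_k is the dimension of the keys.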

Resources: Wayde's slides | Discord forum | Lecture 1 Colab notebook





Recap of Lecture 2

In the second lecture of the W&B x fast.ai x Hugging Face Study Group, Wayde Gilliam made sure you:
  • Understand how Hugging Face's high-level pipeline works under the covers, including how to preprocess your inputs with a tokenizer, run them through a model, and finally postprocess the model's outputs (see the sketch after this list).
  • Take an initial look at Blurr, in particular how it handles tokenization and models in a fastai workflow.
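As a rough illustration of those three steps (not the lecture's exact code), here is a minimal sketch using the default sentiment-analysis checkpoint; the example texts are made up:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The checkpoint pipeline("sentiment-analysis") loads by default
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

texts = ["I loved this movie!", "The plot was a mess."]

# 1. Preprocess: the tokenizer turns raw strings into input_ids + attention_mask tensors
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# 2. Model: a forward pass produces raw logits
with torch.no_grad():
    logits = model(**inputs).logits

# 3. Postprocess: softmax the logits and map class ids back to human-readable labels
probs = torch.softmax(logits, dim=-1)
for text, p in zip(texts, probs):
    label_id = int(p.argmax())
    print(f"{text!r} -> {model.config.id2label[label_id]} ({p[label_id].item():.3f})")
```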
(Optional) homework till Sunday, July 25, 10am PT / 7pm CET / 10:30pm IST:
  • Watch the official course videos from week 2
  • Blog about something you’ve learned from weeks 1 & 2. See if you can include some actual code using pure Hugging Face and/or one of the fastai integration libraries.
  • Re-read the “Attention Is All You Need” paper by Vaswani et al., and optionally pick an architecture you're curious about, find its paper, and give it a read (use Discord to ask any questions about it)
  • 🎉 Get ready for some competitions to be announced next week!

Resources: Wayde's slides | Discord forum | Lecture 2 Colab notebook





Recap of Lecture 3

In this session of the W&B x fast.ai x Hugging Face Study Group, Wayde Gilliam, Zach Mueller, and Arto covered:
  • How to fine-tune Hugging Face transformers with fastai (Wayde; see the sketch after this list)
  • AdaptNLP & fastai: finding and filling gaps between transformers (Zach)
  • How to use Fasthugs (Arto)
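The lecture itself walks through Blurr's integration; purely as an illustration of the general idea (not Blurr's API), here is a minimal sketch that wraps a Hugging Face model in a fastai Learner. The checkpoint, toy data, and wrapper class are all illustrative assumptions:

```python
from fastai.text.all import *                      # Learner, DataLoaders, losses, metrics, fit_one_cycle
import torch
from torch.utils.data import Dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-uncased"             # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
hf_model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

class ToySentimentDS(Dataset):
    "Tiny illustrative dataset: pre-tokenize texts, return ((input_ids, attention_mask), label)."
    def __init__(self, texts, labels):
        enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        self.ids, self.mask = enc["input_ids"], enc["attention_mask"]
        self.labels = torch.tensor(labels)
    def __len__(self): return len(self.labels)
    def __getitem__(self, i): return (self.ids[i], self.mask[i]), self.labels[i]

class HFWrapper(torch.nn.Module):
    "Adapter so fastai's Learner sees a plain module that returns logits."
    def __init__(self, model):
        super().__init__()
        self.model = model
    def forward(self, xb):
        input_ids, attention_mask = xb
        return self.model(input_ids=input_ids, attention_mask=attention_mask).logits

train_ds = ToySentimentDS(["great movie", "terrible plot", "loved it", "so boring"], [1, 0, 1, 0])
valid_ds = ToySentimentDS(["what a film", "awful acting"], [1, 0])

dls = DataLoaders.from_dsets(train_ds, valid_ds, bs=2)
learn = Learner(dls, HFWrapper(hf_model), loss_func=CrossEntropyLossFlat(), metrics=accuracy)
learn.fit_one_cycle(1, 2e-5)                       # one epoch at a small learning rate
```

Blurr, AdaptNLP, and Fasthugs handle most of this plumbing for you; see the slides and Colabs below for the real APIs.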
(Optional) homework till Sunday, August 1:
  • Rewatch the recording of lecture #3
  • Enter our competition 🎉 (see details at the top of this report)

Resources: Wayde's slides + Colab | Zach's slides | Arto's slides + Colab

Join the Discord forum!





Recap of Lecture 4

In this session of the W&B x fast.ai x Hugging Face Study Group, Wayde Gilliam discussed:
  • How to navigate the Hugging Face Hub
  • How to share your own models on the Hub (see the sketch after this list)
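Not the lecture's exact code, just a minimal sketch of sharing a model on the Hub; the checkpoint and the repo name my-finetuned-sst2 are placeholders:

```python
from huggingface_hub import notebook_login
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Authenticate with your Hugging Face account (or run `huggingface-cli login` in a terminal)
notebook_login()

# Load (or fine-tune) a model and its tokenizer
checkpoint = "distilbert-base-uncased"                     # placeholder starting checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# ... fine-tune here ...

# Push the model and tokenizer to a repo under your username on the Hub
model.push_to_hub("my-finetuned-sst2")                     # placeholder repo name
tokenizer.push_to_hub("my-finetuned-sst2")

# Anyone can then load it back by its Hub id, e.g.
# AutoModelForSequenceClassification.from_pretrained("your-username/my-finetuned-sst2")
```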
(Optional) homework till Sunday, August 8:
  • Enter our competition 🎉 (see details at the top of this report)

Resources: Wayde's slides

Join the Discord forum!



👉 Register for Demo Day, next Sunday, Aug 8, to watch the winners! http://wandb.me/fastai-hf