Lecture notes on RNNs and Transformers
Lecture notes from MIT 6.S191 (Introduction to Deep Learning).
NLP powers many of the internet applications we use every day. Transformers and attention modules are now the undisputed state-of-the-art approach to NLP and beyond.
Elvis Saravia, Technical PMM at Facebook, has shared a crisp collection of notes from MIT 6.S191.
Personal recommendation: the notes alone cover the material well, so you can skip the video if you're short on time!
Topics Covered:
Deep sequence modelling
Intuition and Applications of RNNs (see the RNN sketch after this list)
Drawbacks of RNNs
Self-Attention and Transformers (see the attention sketch after this list)
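To make the RNN items above concrete, here is a minimal sketch of a vanilla RNN forward pass. It is not taken from the notes; the weight names (W_xh, W_hh, b_h) and sizes are illustrative assumptions. The explicit loop over timesteps makes the sequential bottleneck visible: step t cannot be computed before step t-1, which is one of the drawbacks that motivates attention.

```python
# Minimal vanilla RNN forward pass (illustrative sketch; weight names and
# dimensions are assumptions, not from the lecture notes).
import numpy as np

def rnn_forward(X, W_xh, W_hh, b_h):
    """Run a vanilla RNN over X of shape (seq_len, d_in); return all hidden states."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in X:  # strictly sequential: no parallelism across timesteps
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
        states.append(h)
    return np.stack(states)

# Toy usage: 5 timesteps, 3 input features, hidden size 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
W_xh = rng.normal(size=(3, 4)) * 0.1
W_hh = rng.normal(size=(4, 4)) * 0.1
b_h = np.zeros(4)
print(rnn_forward(X, W_xh, W_hh, b_h).shape)  # -> (5, 4)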
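And here is a minimal sketch of single-head scaled dot-product self-attention, the core operation behind Transformers. Again, the projection names (Wq, Wk, Wv) and dimensions are illustrative assumptions rather than anything from the notes. Unlike the RNN loop above, every position attends to every other position in one matrix product, so the whole sequence is processed in parallel.

```python
# Minimal single-head scaled dot-product self-attention (illustrative sketch;
# projection names and sizes are assumptions, not from the lecture notes).
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv         # project inputs to queries/keys/values
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise similarities, scaled by sqrt(d_k)
    weights = softmax(scores, axis=-1)       # each row is a distribution over positions
    return weights @ V                       # weighted sum of values per position

# Toy usage: 4 tokens, model width 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq = rng.normal(size=(8, 8))
Wk = rng.normal(size=(8, 8))
Wv = rng.normal(size=(8, 8))
print(self_attention(X, Wq, Wk, Wv).shape)   # -> (4, 8)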
Tags: ML News