Reports
Reformer: The Efficient Transformer
Our fast.ai community submission to the Reproducibility Challenge 2020, reproducing "Reformer: The Efficient Transformer" by Nikita Kitaev, Łukasz Kaiser and Anselm Levskaya (accepted at ICLR 2020).
Last edited: 2021-01-28

Reformer Reproducibility - Long Draft
A fast.ai community submission to the Reproducibility Challenge 2020, reproducing "Reformer: The Efficient Transformer" by Nikita Kitaev, Łukasz Kaiser and Anselm Levskaya (accepted at ICLR 2020).
Last edited: 2021-02-09

Reformer Reproducibility
Submission to the Reproducibility Challenge 2020, reproducing "Reformer: The Efficient Transformer" by Nikita Kitaev, Łukasz Kaiser and Anselm Levskaya (accepted at ICLR 2020).
Last edited: 2021-01-11

Extra: Shared/separate QK for the synthetic task
Comparing transformer language models with shared vs. separate query-key (QK) attention on the synthetic task.
Last edited: 2021-01-28

Experiments: Hashing rounds
LSH attention performance as a function of the number of hashing rounds.
Last edited: 2021-01-24

Experiments: Memory Consumption
We demonstrate how the memory allocation of the various Reformer variants compares to that of the standard Transformer during training.
Last edited: 2021-01-27

Experiments: Synthetic task
Results on the synthetic task from the Reformer paper (https://arxiv.org/abs/2001.04451).
Last edited: 2021-01-13

Experiments: Shared Query-Key Attention
We compare the performance of shared query-key attention, as used in the Reformer, to that of standard attention during training.
Last edited: 2021-01-27

Experiments: LSH-attention evaluation speed
We demonstrate how LSH attention evaluation speed changes as sequence length increases.
Last edited: 2021-01-26

Methods: LSH
We give a brief explanation of LSH (locality-sensitive hashing) attention; a minimal sketch of the bucketing step follows at the end of this list. For a step-by-step walk-through of LSH attention, see our project documentation: https://arampacha.github.io/reformer_fastai/exploration.LSH.html
Last edited: 2021-01-25

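To accompany the Methods: LSH entry above, here is a minimal sketch of the angular LSH bucketing scheme described in the Reformer paper, written in PyTorch. The function name `lsh_bucket_ids` and its signature are our own illustration, not code from the reports; it assumes the shared query-key vectors of a single attention head.

```python
import torch

def lsh_bucket_ids(qk: torch.Tensor, n_buckets: int, n_rounds: int = 1) -> torch.Tensor:
    """Assign each position to an LSH bucket using random rotations.

    qk: shared query-key vectors of shape (seq_len, d_model).
    Returns integer bucket ids of shape (n_rounds, seq_len).
    """
    assert n_buckets % 2 == 0, "angular LSH needs an even number of buckets"
    d_model = qk.shape[-1]
    # One random rotation matrix per hashing round.
    rotations = torch.randn(n_rounds, d_model, n_buckets // 2, device=qk.device)
    # Project each vector onto the random directions: (n_rounds, seq_len, n_buckets/2).
    rotated = torch.einsum("sd,rdb->rsb", qk, rotations)
    # Concatenating [xR; -xR] gives n_buckets candidate directions;
    # the nearest one (argmax) is the bucket id.
    return torch.cat([rotated, -rotated], dim=-1).argmax(dim=-1)

# Example: hash a sequence of 1024 positions into 32 buckets with 4 rounds.
qk = torch.randn(1024, 64)
buckets = lsh_bucket_ids(qk, n_buckets=32, n_rounds=4)  # shape (4, 1024)
```

Positions hashed to the same bucket are chunked and attended to together; running several rounds in parallel (see the Experiments: Hashing rounds entry) lowers the probability that similar vectors land in different buckets by an unlucky rotation.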