Snoop pretraining on CoNaLa
Seeing how well pretraining GPT-2 on 1 GB of Python snoop traces of HackerRank problems works.
Fraser Greenlee
Created on May 6 | Last edited on May 6
Section 1
[Chart: train_loss and eval_loss vs. Step for runs gpt2_medium_from_scratch, gpt2_medium_openai_pretrain, gpt2_q1_10_btch_17_epoch, and gpt2_q1_10_btch_35_epoch_masked_train]
[Chart: eval_perplexity vs. Step for runs gpt2_medium_from_scratch, gpt2_medium_openai_pretrain, gpt2_q1_10_btch_17_epoch, and gpt2_q1_10_btch_35_epoch_masked_train]
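The pretraining data here is line-by-line execution traces of Python programs. As a rough sketch of what such a trace looks like, below is a minimal tracer built on the standard library's sys.settrace. This is only an illustration: the report's actual data was produced with the snoop library, and the solve function is a hypothetical stand-in for a HackerRank solution.

```python
import sys
import linecache

def trace_source(func, *args):
    """Run func(*args) and collect one record per executed source line,
    loosely mimicking the shape of a snoop trace (sketch, not snoop itself)."""
    records = []

    def tracer(frame, event, arg):
        # Only record 'line' events that belong to the traced function.
        if event == "line" and frame.f_code is func.__code__:
            src = linecache.getline(frame.f_code.co_filename, frame.f_lineno).strip()
            records.append(f"{frame.f_lineno:4d} | {src:<24} | locals={dict(frame.f_locals)}")
        return tracer

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    return result, records

def solve(n):
    # Hypothetical HackerRank-style solution: sum of 0..n-1.
    total = 0
    for i in range(n):
        total += i
    return total

result, trace = trace_source(solve, 4)
print(result)  # 6 (sum of 0..3)
print("\n".join(trace))
```

Concatenating traces like these over many solutions would yield the kind of corpus the runs above are trained on, with each line pairing source code against the interpreter's current local-variable state.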