tuned-lens (levmckinney's personal workspace on Weights & Biases)
Runs: 46 total across 18 groups (1-18 of 18 shown, 5 visualized)

Group                                   Visualized  Runs
meta-llama/Meta-Llama-3-8B              0           1
meta-llama/Meta-Llama-3-8B-Instruct     0           1
huggyllama/llama-7b                     5           7
EleutherAI/pythia-410m-deduped          6           9
meta-llama/Llama-2-7b-chat-hf           0           2
meta-llama/Llama-2-7b-hf                0           3
meta-llama/Llama-2-13b-chat-hf          0           2
meta-llama/Llama-2-13b-hf               0           2
EleutherAI/pythia-70m                   0           1
llama-7b                                0           2
pythia-2.8b-deduped-v0                  0           2
pythia-12b-deduped                      0           1
pythia-6.9b-deduped                     0           1
pythia-2.8b-deduped                     0           2
pythia-1.4b-deduped                     0           1
pythia-410m-deduped                     0           1
pythia-70m-deduped                      0           7
pythia-160m-deduped                     0           1
bias_norm (32 panels, showing 1-6 of 32)

[Panel residue: line charts of bias_norm/8.ffn, bias_norm/7.ffn, bias_norm/18.ffn, bias_norm/15.ffn, bias_norm/5.ffn, and bias_norm/30.ffn versus training Step (roughly 0-200), plotted for the groups meta-llama/Meta-Llama-3-8B, meta-llama/Meta-Llama-3-8B-Instruct, and meta-llama/Llama-2-7b-hf (legend fields w_kl and w_ce shown as "-").]
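The bias_norm/N.ffn panels appear to track a per-layer bias norm logged against the training Step. As a rough, hypothetical sketch only (the project name, metric names, run group, and legend keys w_kl/w_ce are taken from the page above; the layer count and bias tensors are placeholders, and this is not the actual tuned-lens training code), metrics with this naming scheme could be produced with the standard wandb API like so:

```python
import torch
import wandb

model_name = "meta-llama/Meta-Llama-3-8B"      # one of the run groups listed above
run = wandb.init(
    project="tuned-lens",
    group=model_name,                          # runs are grouped by base model
    config={"w_kl": None, "w_ce": None},       # legend fields shown as "-" above
)

num_layers = 32                                # placeholder; depends on the model
for step in range(200):                        # the Step axis spans roughly 0-200
    # Placeholder biases; in a real run these would be the learned per-layer biases.
    biases = [torch.randn(4096) for _ in range(num_layers)]
    run.log(
        {f"bias_norm/{i}.ffn": torch.linalg.norm(b).item() for i, b in enumerate(biases)},
        step=step,
    )

run.finish()
```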
loss (32 panels, showing 1-6 of 32)

weight_norm (32 panels, showing 1-6 of 32)