Learning Rate Sweep Results
Exploring model fine-tuning with different learning rates
Created on June 2 | Last edited on June 2
Planets Dataset Learning Rate Sweeps
LeViT
A simple sweep over learning_rate
[Panel grid: Planets Sweep, lr in [0.002, 0.008]; 30 runs]
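The sweep setup can be sketched as a W&B-style configuration. This is a minimal sketch, not the report's actual config: the metric name, project name, and train function are illustrative assumptions.

```python
# Sketch of the learning-rate sweep configuration (W&B-style).
# Metric name "valid_loss" and the project name are assumptions, not from the report.
sweep_config = {
    "method": "random",  # sample lr values at random from the range below
    "metric": {"name": "valid_loss", "goal": "minimize"},
    "parameters": {
        "learning_rate": {
            "distribution": "uniform",
            "min": 0.002,
            "max": 0.008,
        },
    },
}

# Launching it would look roughly like this (requires wandb and a logged-in account):
# import wandb
# sweep_id = wandb.sweep(sweep_config, project="planets-lr-sweep")
# wandb.agent(sweep_id, function=train, count=30)
```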
I think we can actually go higher with LeViT.
Something odd happens with the validation loss; it needs more exploration. I think it is model-related.
ResNets
[Panel grid: Planets Sweep, lr in [0.002, 0.008]; 48 runs]
ViT
[Panel grid: Planets Sweep, lr in [0.002, 0.008]; 60 runs]
RegNetX
[Panel grid: Planets Sweep, lr in [0.002, 0.008]; 66 runs]
RegNetY
[Panel grid: Planets Sweep, lr in [0.002, 0.008]; 36 runs]
ConvNeXt
[Panel grid: Planets Sweep, lr in [0.002, 0.008]; 29 runs]
Pets Dataset Learning Rate Sweep
This is a bigger dataset; the results are not as clear as for Planets.
LeViT
[Panel grid: lr in [0.002, 0.008]; 30 runs]
More LeViT: lr in [0, 0.1]
[Panel grid: lr sampled uniformly in [0, 0.1]; 297 runs]
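The wider LeViT sweep draws the learning rate uniformly from [0, 0.1]. As a sketch (again W&B-style; the key names are illustrative assumptions, not the report's actual config):

```python
# Sketch of the wider LeViT sweep: lr drawn uniformly from [0, 0.1].
# W&B-style parameter block; key names are illustrative assumptions.
wide_sweep_parameters = {
    "learning_rate": {
        "distribution": "uniform",
        "min": 0.0,
        "max": 0.1,
    },
}
```

Note that a uniform distribution over [0, 0.1] places most samples at learning rates far above the earlier [0.002, 0.008] range; a log-uniform distribution is a common alternative when a sweep spans several orders of magnitude.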
ResNets
[Panel grid: lr in [0.002, 0.008]; 42 runs]
ViT
[Panel grid: lr in [0.002, 0.008]; 60 runs]
RegNetX
[Panel grid: lr in [0.002, 0.008]; 63 runs]
RegNetY
[Panel grid: lr in [0.002, 0.008]; 36 runs]
ConvNeXt
[Panel grid: lr in [0.002, 0.008]; 30 runs]