
Learning Rate Sweep results

Exploring model fine-tuning with different learning rates
Created on June 2 | Last edited on June 2

Planets Dataset Learning Rate Sweeps

LeViT

A simple sweep over learning_rate

[Parallel coordinates plot: learning_rate in [0.002, 0.008] against model_name (levit_128, levit_128s, levit_192, levit_256, levit_384), valid_loss, and accuracy_multi]
Planets Sweep lr in [0.002, 0.008] (30 runs)
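A sweep like this one can be described with a W&B sweep configuration. The sketch below is a hypothetical reconstruction: the search method and metric choice are assumptions (the report does not state them); only the learning-rate range and the LeViT model names come from the panels above.

```python
# Hypothetical W&B sweep config matching the Planets LeViT panels:
# learning_rate drawn from [0.002, 0.008] across the five LeViT variants.
sweep_config = {
    "method": "random",  # assumption: random search; the report doesn't say
    "metric": {"name": "valid_loss", "goal": "minimize"},  # assumed metric
    "parameters": {
        "learning_rate": {
            "distribution": "uniform",
            "min": 0.002,
            "max": 0.008,
        },
        "model_name": {
            "values": [
                "levit_128",
                "levit_128s",
                "levit_192",
                "levit_256",
                "levit_384",
            ]
        },
    },
}

# A dict like this would be registered with `wandb.sweep(sweep_config, project=...)`
# and runs launched with `wandb.agent(sweep_id, function=train)`.
print(sweep_config["parameters"]["learning_rate"])
```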

I think we can actually go higher with LeViT.


Something weird happens with the validation loss; it needs more exploration. I think it's model-related.

ResNets


Planets Sweep lr in [0.002, 0.008] (48 runs)



ViT


Planets Sweep lr in [0.002, 0.008] (60 runs)



RegNetX


Planets Sweep lr in [0.002, 0.008] (66 runs)


RegNetY



Planets Sweep lr in [0.002, 0.008] (36 runs)


ConvNeXt



Planets Sweep lr in [0.002, 0.008] (29 runs)


Pets Dataset Learning Rate Sweep

This is a bigger dataset; the results are not as clear as for Planets.

LeViT



lr in [0.002, 0.008] (30 runs)

More LeViT: lr in [0, 0.1]


lr sampled uniformly in [0, 0.1] (297 runs)
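For this wider LeViT sweep, only the learning-rate bounds change. The config below is again a hypothetical sketch: the search method and metric are assumptions; the uniform [0, 0.1] range is from the report.

```python
# Hypothetical W&B sweep config for the wider LeViT sweep on Pets:
# learning_rate drawn uniformly from [0, 0.1], per the report.
wide_sweep_config = {
    "method": "random",  # assumption: the report only states the distribution
    "metric": {"name": "valid_loss", "goal": "minimize"},  # assumed metric
    "parameters": {
        "learning_rate": {
            "distribution": "uniform",
            "min": 0.0,
            "max": 0.1,
        },
    },
}

print(wide_sweep_config["parameters"]["learning_rate"])
```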


ResNets


lr in [0.002, 0.008] (42 runs)


ViT


lr in [0.002, 0.008] (60 runs)


RegNetX



lr in [0.002, 0.008] (63 runs)


RegNetY



lr in [0.002, 0.008] (36 runs)


ConvNeXt



lr in [0.002, 0.008] (30 runs)


All