Morgan's workspace

Runs: 5

| Name | State | Created | Runtime | train_loss | epoch | batch size | model parameters |
|---|---|---|---|---|---|---|---|
| - | Crashed | - | 11d 6h 28m 37s | 1.34629 | 50 | 256 | 19047424 |
| - | Finished | - | 1d 18h 39m 55s | 2.27182 | 50 | 256 | 19047424 |
| - | Finished | - | 37m 36s | NaN | 50 | 256 | 19047424 |
| - | Finished | - | 30m 17s | NaN | 50 | 256 | 19047424 |
| - | Finished | - | 19m 41s | 1.09208 | 50 | 256 | 19047424 |
All five runs log an identical configuration; only the in-memory object addresses recorded under `Learner._name` and `Learner.opt_func` differ from run to run, so the values are shown once below ("-" marks an unset field).

| Config key | Value |
|---|---|
| Notes | Track gpt baseline performance |
| User | morgan |
| Tags | test |
| Sweep | - |
| ActivationStats.cpu | true |
| ActivationStats.detach | true |
| ActivationStats.every | 25 |
| ActivationStats.is_forward | true |
| ActivationStats.remove_end | true |
| GradientClip.max_norm | - |
| GradientClip.norm_type | - |
| Learner._name | <fastai.learner.Learner object at 0x7f4222f99a60> |
| Learner.loss_func._name | FlattenedLoss of CrossEntropyLoss() |
| Learner.loss_func.axis | -1 |
| Learner.loss_func.flatten | true |
| Learner.loss_func.floatify | false |
| Learner.loss_func.is_2d | true |
| Learner.lr | 0.001 |
| Learner.metrics | - |
| Learner.model_dir | models |
| Learner.moms | [0.95, 0.85, 0.95] |
| Learner.opt_func | functools.partial(<function Adam at 0x7f4225e12670>, sqr_mom=0.95, wd=0.1) |
| Learner.path | . |
| Learner.splitter | fastai.torch_core.trainable_params |
| Learner.train_bn | true |
| Learner.wd_bn_bias | false |
| MixedPrecision | true |
| ParamScheduler | true |
| ProgressCallback | true |
| Recorder.add_time | true |
| Recorder.train_metrics | false |
| Recorder.valid_metrics | true |
| SaveModelCallback.every_epoch | - |
| SaveModelCallback.fname | - |
| SaveModelCallback.with_opt | - |
| TrainEvalCallback | true |
| WandbCallback.log | gradients |
| WandbCallback.log_dataset | false |
| WandbCallback.log_model | false |
| WandbCallback.log_preds | false |
| WandbCallback.n_preds | 36 |
| WandbCallback.reorder | true |
| WandbCallback.seed | 12345 |
| batch per epoch | 35 |
| dataset.tfms | Pipeline: CharTransform -- {} |
| device | cuda |
| dls.after_batch | Pipeline: |
| dls.after_item | Pipeline: |
| dls.before_batch | Pipeline: |
| frozen idx | 0 |
| frozen | false |
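The configuration logged for these runs can be read back as a training setup. Below is a minimal, hedged sketch of how such a run might have been launched with fastai's W&B integration; it is a configuration fragment, not the authors' actual script. The `dls` DataLoaders, the `model`, and the project name are placeholders (the original data and model are not recorded here), while the loss function, optimizer, learning rate, momenta, callbacks, and epoch count mirror the logged values.

```python
import wandb
from functools import partial
from fastai.text.all import *            # Learner, Adam, CrossEntropyLossFlat, callbacks
from fastai.callback.wandb import WandbCallback

# Placeholders: the table does not record the dataset or architecture,
# only that it is a char-level GPT baseline with 19,047,424 parameters.
# dls = ...   # DataLoaders built with a CharTransform pipeline, bs=256
# model = ... # the GPT baseline model

run = wandb.init(
    project="gpt-baseline",              # hypothetical project name
    notes="Track gpt baseline performance",
    tags=["test"],
)

learn = Learner(
    dls, model,
    loss_func=CrossEntropyLossFlat(),    # logged as "FlattenedLoss of CrossEntropyLoss()"
    opt_func=partial(Adam, sqr_mom=0.95, wd=0.1),
    lr=1e-3,                             # Learner.lr = 0.001
    moms=(0.95, 0.85, 0.95),             # Learner.moms
    cbs=[
        WandbCallback(log="gradients", log_preds=False, seed=12345),
        GradientClip(),                  # max_norm/norm_type unset in the logged config
        ActivationStats(every=25),
        MixedPrecision(),
    ],
)
learn.fit_one_cycle(50)                  # every run trained for 50 epochs
run.finish()
```

`fit_one_cycle` would account for the `ParamScheduler: true` entry, and `ProgressCallback`, `Recorder`, and `TrainEvalCallback` are added by fastai automatically rather than passed explicitly.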