Autometa's group workspace
Group: gemma-2b
Three runs are visualized, grouped by job type (one run per type): lm_eval, train-dpo, train-sft. The Notes, Tags, Created, and Sweep columns were empty for all three runs.

| Name (job type) | State | User | Runtime |
|---|---|---|---|
| lm_eval | Finished | capecape | 1h 34m 33s |
| train-dpo | Finished | capecape | 10h 53m 6s |
| train-sft | Finished | capecape | 2h 37m 50s |

Run configuration (a dash means the key was not set for that run):

| Parameter | lm_eval | train-dpo | train-sft |
|---|---|---|---|
| _name_or_path | - | /workspace/artifacts/gemma-2b:v0 | google/gemma-2b |
| accelerator_config.even_batches | - | true | true |
| accelerator_config.split_batches | - | false | false |
| accelerator_config.use_seedable_sampler | - | true | true |
| adafactor | - | false | false |
| adam_beta1 | - | 0.9 | 0.9 |
| adam_beta2 | - | 0.999 | 0.999 |
| adam_epsilon | - | 1e-8 | 1e-8 |
| add_cross_attention | - | false | false |
| architectures | - | GemmaForCausalLM | GemmaForCausalLM |
| attention_bias | - | false | false |
| attention_dropout | - | 0 | 0 |
| auto_find_batch_size | - | false | false |
| beta | - | 0.01 | - |
| bf16 | - | true | true |
| bf16_full_eval | - | false | false |
| bos_token_id | - | 2 | 2 |
| chunk_size_feed_forward | - | 0 | 0 |
| cli_configs.batch_size | 1 | - | - |
| cli_configs.batch_sizes | - | - | - |
| cli_configs.bootstrap_iters | 100000 | - | - |
| cli_configs.device | cuda:0 | - | - |
| cli_configs.limit | - | - | - |
| cli_configs.model | hf | - | - |
| cli_configs.model_args | pretrained=google/gemma-2b,dtype=bfloat16 | - | - |
| dataloader_drop_last | - | false | false |
| dataloader_num_workers | - | 0 | 0 |
| dataloader_persistent_workers | - | false | false |
| dataloader_pin_memory | - | true | true |
| ddp_timeout | - | 1800 | 1800 |
| debug | - | - | - |
| disable_tqdm | - | false | false |
| diversity_penalty | - | 0 | 0 |
| do_eval | - | true | true |
| do_predict | - | false | false |
| do_sample | - | false | false |
| do_train | - | false | false |
| early_stopping | - | false | false |
| encoder_no_repeat_ngram_size | - | 0 | 0 |
| eos_token_id | - | 1 | 1 |
| eval_delay | - | 0 | 0 |
| eval_steps | - | 100 | - |
| evaluation_strategy | - | steps | epoch |
| fp16 | - | false | false |
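The cli_configs.* columns of the lm_eval run correspond to an lm-evaluation-harness invocation. A minimal sketch that assembles that command line from the table's values — the flag names are an assumption about the harness CLI version used, not something recorded in the table:

```python
# Values transcribed from the lm_eval run's cli_configs.* columns above;
# unset keys (batch_sizes, limit) are omitted.
cli_configs = {
    "model": "hf",
    "model_args": "pretrained=google/gemma-2b,dtype=bfloat16",
    "device": "cuda:0",
    "batch_size": "1",
    "bootstrap_iters": "100000",
}

# Assemble the equivalent `lm_eval` command (flag names assumed to
# mirror the config keys, as the harness CLI conventionally does).
cmd = "lm_eval " + " ".join(f"--{key} {value}" for key, value in cli_configs.items())
print(cmd)
```

Running the resulting command would require the harness installed plus a CUDA device; the sketch only shows how the logged config maps back to an invocation.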
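The two training runs share almost every hyperparameter; a quick sketch of the differences, with values transcribed from the table (`None` stands in for a dash, and a few shared keys are included to show the diff filters them out):

```python
# Subset of the train-sft and train-dpo columns above.
train_sft = {
    "_name_or_path": "google/gemma-2b",
    "beta": None,
    "eval_steps": None,
    "evaluation_strategy": "epoch",
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "bf16": True,
    "ddp_timeout": 1800,
}
train_dpo = {
    "_name_or_path": "/workspace/artifacts/gemma-2b:v0",
    "beta": 0.01,  # the DPO beta; unset for the SFT run
    "eval_steps": 100,
    "evaluation_strategy": "steps",
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "bf16": True,
    "ddp_timeout": 1800,
}

# Keep only keys whose values differ between the two runs.
diff = {key: (train_sft[key], train_dpo[key])
        for key in train_sft if train_sft[key] != train_dpo[key]}
print(diff)
```

The diff surfaces _name_or_path, beta, eval_steps, and evaluation_strategy: the DPO run starts from the SFT artifact (/workspace/artifacts/gemma-2b:v0) rather than the base checkpoint, which is consistent with an SFT-then-DPO pipeline.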