Autometa's group workspace
Group: gemma-7b
Runs: 10 visualized

| Job Type  | Runs |
|-----------|------|
| lm_eval   | 7    |
| train-dpo | 2    |
| train-sft | 1    |
Run groups and configuration (one column per job-type group):

| Field | lm_eval (7 runs) | train-dpo (2 runs) | train-sft (1 run) |
|---|---|---|---|
| State | Finished | Failed | Finished |
| Notes | | | |
| User | capecape | capecape | capecape |
| Tags | | | |
| Created | | | |
| Runtime | 6d 8h 7m | 1d 13h 47m 24s | 14h 43m 28s |
| Sweep | - | - | - |
| _name_or_path | - | /workspace/artifacts/gemma-7b:v0 | google/gemma-7b |
| accelerator_config.even_batches | - | true | true |
| accelerator_config.split_batches | - | false | false |
| accelerator_config.use_seedable_sampler | - | true | true |
| adafactor | - | false | false |
| adam_beta1 | - | 0.9 | 0.9 |
| adam_beta2 | - | 0.999 | 0.999 |
| adam_epsilon | - | 1.0000e-8 | 1.0000e-8 |
| add_cross_attention | - | false | false |
| architectures | - | GemmaForCausalLM | GemmaForCausalLM |
| attention_bias | - | false | false |
| attention_dropout | - | 0 | 0 |
| auto_find_batch_size | - | false | false |
| beta | - | 0.01 | - |
| bf16 | - | true | true |
| bf16_full_eval | - | false | false |
| bos_token_id | - | 2 | 2 |
| chunk_size_feed_forward | - | 0 | 0 |
| cli_configs.batch_size | 1 | - | - |
| cli_configs.batch_sizes | - | - | - |
| cli_configs.bootstrap_iters | 100000 | - | - |
| cli_configs.device | cuda:0 | - | - |
| cli_configs.limit | - | - | - |
| cli_configs.model | hf | - | - |
| cli_configs.model_args | ["pretrained=/workspace/eval_harness/artifacts/gemma-7b:v0,dtype=bfloat16", "pretrained=/workspace/eval_harness/artifacts/gemma-7b:v1,dtype=bfloat16", "pretrained=google/gemma-7b,dtype=bfloat16"] | - | - |
| dataloader_drop_last | - | false | false |
| dataloader_num_workers | - | 0 | 0 |
| dataloader_persistent_workers | - | false | false |
| dataloader_pin_memory | - | true | true |
| ddp_timeout | - | 1800 | 1800 |
| debug | - | - | - |
| disable_tqdm | - | false | false |
| diversity_penalty | - | 0 | 0 |
| do_eval | - | true | true |
| do_predict | - | false | false |
| do_sample | - | false | false |
| do_train | - | false | false |
| early_stopping | - | false | false |
| encoder_no_repeat_ngram_size | - | 0 | 0 |
| eos_token_id | - | 1 | 1 |
| eval_delay | - | 0 | 0 |
| eval_steps | - | 100 | - |
| evaluation_strategy | - | steps | epoch |
| fp16 | - | false | false |
Showing groups 1-3 of 3.
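The `cli_configs.model_args` cell for the lm_eval group aggregates the distinct values used across its runs; each entry is a comma-separated `key=value` string of the kind the evaluation-harness CLI accepts. A minimal sketch of splitting these entries into dicts, assuming the simple `key=value,key=value` format shown in the table (`parse_model_args` is a hypothetical helper, not part of any harness):

```python
import json

# Aggregated cell value, copied verbatim from the workspace table above.
model_args = json.loads(
    '["pretrained=/workspace/eval_harness/artifacts/gemma-7b:v0,dtype=bfloat16",'
    '"pretrained=/workspace/eval_harness/artifacts/gemma-7b:v1,dtype=bfloat16",'
    '"pretrained=google/gemma-7b,dtype=bfloat16"]'
)

def parse_model_args(s: str) -> dict:
    """Split a comma-separated key=value string into a dict (hypothetical helper)."""
    return dict(pair.split("=", 1) for pair in s.split(","))

for entry in model_args:
    cfg = parse_model_args(entry)
    print(cfg["pretrained"], cfg["dtype"])
```

Under this reading, the three entries correspond to two logged gemma-7b artifact checkpoints (`:v0`, `:v1`) and the base `google/gemma-7b` model, all evaluated in bfloat16.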