Dk-crazydiv's group workspace
Group: testing_wikiner
7 runs shown (1-7 of 7), all created by user hassi_ahk; 2 crashed, 5 finished. The Name, Notes, Tags, and Created columns are empty for every run in this view.

Run summary:

| Run | State    | User      | Runtime    | Sweep | _name_or_path                            |
|-----|----------|-----------|------------|-------|------------------------------------------|
| 1   | Crashed  | hassi_ahk | 2h 48s     | -     | -                                        |
| 2   | Crashed  | hassi_ahk | 1h 13m 37s | -     | -                                        |
| 3   | Finished | hassi_ahk | 2m 59s     | -     | mrm8488/HindiBERTa                       |
| 4   | Finished | hassi_ahk | 5m 25s     | -     | flax-community/roberta-pretraining-hindi |
| 5   | Finished | hassi_ahk | 5m 23s     | -     | flax-community/roberta-pretraining-hindi |
| 6   | Finished | hassi_ahk | 5m 17s     | -     | flax-community/roberta-pretraining-hindi |
| 7   | Finished | hassi_ahk | 5m 31s     | -     | flax-community/roberta-pretraining-hindi |

The two crashed runs logged no configuration (every config column is "-"). The five finished runs share an identical configuration apart from _name_or_path:

| Parameter                    | Value                  |
|------------------------------|------------------------|
| _n_gpu                       | 1                      |
| adafactor                    | false                  |
| adam_beta1                   | 0.9                    |
| adam_beta2                   | 0.999                  |
| adam_epsilon                 | 1.0000e-8              |
| add_cross_attention          | false                  |
| architectures                | ["RobertaForMaskedLM"] |
| attention_probs_dropout_prob | 0.1                    |
| bos_token_id                 | 0                      |
| chunk_size_feed_forward      | 0                      |
| classifier_dropout_prob      | -                      |
| dataloader_drop_last         | false                  |
| dataloader_num_workers       | 0                      |
| dataloader_pin_memory        | true                   |
| ddp_find_unused_parameters   | None                   |
| debug                        | []                     |
| deepspeed                    | None                   |
| disable_tqdm                 | false                  |
| diversity_penalty            | 0                      |
| do_eval                      | true                   |
| do_predict                   | false                  |
| do_sample                    | false                  |
| do_train                     | false                  |
| down_scale_factor            | -                      |
| early_stopping               | false                  |
| embedding_size               | -                      |
| encoder_no_repeat_ngram_size | 0                      |
| eos_token_id                 | 2                      |
| eval_accumulation_steps      | None                   |
| eval_batch_size              | 8                      |
| eval_steps                   | 500                    |
| evaluation_strategy          | epoch                  |
| fp16                         | false                  |
| fp16_backend                 | auto                   |
| fp16_full_eval               | false                  |
| fp16_opt_level               | O1                     |
| gap_size                     | -                      |
| gradient_accumulation_steps  | 1                      |
| gradient_checkpointing       | false                  |
| greater_is_better            | None                   |
| group_by_length              | false                  |
| hidden_act                   | gelu                   |
| hidden_dropout_prob          | 0.1                    |
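The config columns are the Hugging Face model/TrainingArguments fields that W&B captured for each run. As a minimal sketch of how the shared values above map back to code, the snippet below configures a token-classification fine-tuning run with those hyperparameters and logs it to the same W&B group; the output path, label count, and dataset wiring are placeholders rather than values from the table, and the actual script behind these runs is not part of this workspace view.

```python
import os

from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Group new runs under the group shown in this workspace.
os.environ["WANDB_RUN_GROUP"] = "testing_wikiner"

# Either checkpoint from the _name_or_path column works here.
model_name = "flax-community/roberta-pretraining-hindi"  # or "mrm8488/HindiBERTa"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels is a placeholder; the real label set comes from the NER data.
model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=9)

# Hyperparameters copied from the shared config of the finished runs.
args = TrainingArguments(
    output_dir="wikiner-hindi",        # hypothetical output path
    report_to="wandb",                 # send metrics to Weights & Biases
    evaluation_strategy="epoch",
    per_device_eval_batch_size=8,
    eval_steps=500,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    gradient_accumulation_steps=1,
    gradient_checkpointing=False,
    fp16=False,
    dataloader_num_workers=0,
    dataloader_pin_memory=True,
    group_by_length=False,
)

# Dataset preparation is omitted; plug in tokenized WikiNER splits.
# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```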