Dk-crazydiv's group workspace
bbca
Tags
roberta-pretraining-hindi_mc4_epoch8
Notes
Author
State
Crashed
Start time
July 12th, 2021 3:25:11 PM
Runtime
1h 26m 15s
Tracked hours
6m 5s
Run path
wandb/hf-flax-roberta-hindi/owy2v1a5
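The run path identifies this run as entity/project/run_id. Below is a minimal sketch, assuming a logged-in `wandb` client, of fetching the run through the wandb public API to reproduce the fields shown on this page:

```python
# Minimal sketch (assumption: a logged-in wandb client); fetches this run by
# its run path "entity/project/run_id" via the wandb public API.
import wandb

api = wandb.Api()
run = api.run("wandb/hf-flax-roberta-hindi/owy2v1a5")

print(run.state)   # "crashed", as shown in the State field above
print(run.url)     # link back to this run page
```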
OS
Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic
Python version
3.7.10
Command
BBCA downstream indic-glued.ipynb
System Hardware
CPU count | 2
GPU count | 1
GPU type | Tesla P100-PCIE-16GB
W&B CLI Version
0.10.33
Group
bbca
Config
Config parameters are your model's inputs. The run logged 140 config keys; the partial listing below shows values from the collapsed tree view without their key names. See the sketch after this list for pulling the full key/value mapping.
- 1
- "flax-community/roberta-pretraining-hindi"
- false
- 0.9
- 0.999
- 0.00000001
- false
- [] 1 item
- "RobertaForMaskedLM"
- 0.1
- null
- 0
- 0
- false
- 0
- true
- "None"
- "[]"
- null
- "None"
- false
- 0
- true
- false
- false
- false
- false
- 0
- 2
- "None"
- 8
- 500
- "epoch"
- null
- null
- null
- false
- "auto"
- false
- "O1"
- 1
- false
- "None"
- false
- "gelu"
- 0.1
- 768
- 50,265
- 0
- 0
- 0
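The values above come from the collapsed config tree, so their key names are not visible on this page. A sketch, reusing the `run` handle from the example above, of listing the full set of 140 config keys with their values (any key name mentioned in the comments is a hypothetical example, not taken from this page):

```python
# Minimal sketch: run.config is a plain dict of the logged config,
# so the key names hidden by the collapsed tree can be printed directly.
for key, value in sorted(run.config.items()):
    print(f"{key} = {value!r}")

# Single entries read like any dict; the key name below is a hypothetical
# example, not confirmed by this page:
# run.config.get("model_name_or_path")
```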
Summary
Summary metrics are your model's outputs. The run logged 15 summary keys; the partial listing below shows values from the collapsed view without their key names. See the sketch after this list for programmatic access.
- 0.7113163972286374
- 1.4850834608078003
- 4.0404
- 214.337
- 26.978
- {} 4 keys
- "graph-file"
- "media/graph/graph_0_summary_2c2660078f548cff6d21.graph.json"
- "2c2660078f548cff6d2103e6612a309278d610e3011486d600e24ecb1fef9dce"
- 26,108
- 5
- 2,170
- 0.00000391705069124424
- 0.1997
- 1,659,585,530,365,440
- 0.5592170223113029
- 323.9097
- 53.518
- 6.699
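Summary values can be read the same way. A sketch reusing the same `run` handle; `run.summary._json_dict` follows the export pattern from the W&B docs, and the per-step history call returns a sampled pandas DataFrame:

```python
# Minimal sketch: final summary values of the run fetched above.
for key, value in run.summary._json_dict.items():
    print(key, value)

# Per-step metrics (as opposed to final summary values) come back as a
# pandas DataFrame, sampled by default:
history = run.history()
print(history.columns.tolist())
```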
Artifact Outputs
This run produced these artifacts as outputs.
Type | Name | Consumer count
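The artifacts in the table above can also be listed programmatically. A sketch, assuming `logged_artifacts()` is available in the installed client version, printing the type and name columns of the table:

```python
# Minimal sketch: enumerate artifacts this run logged as outputs.
for artifact in run.logged_artifacts():
    print(artifact.type, artifact.name)
```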