
Dk-crazydiv's group workspace

bbca_new

roberta-pretraining-hindi_mc4-grouptext-epoch6

State: Crashed
Start time: July 13th, 2021 3:50:47 PM
Runtime: 1h 24m 53s
Tracked hours: 4m 31s
Run path: wandb/hf-flax-roberta-hindi/1gp0u2q6
OS: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic
Python version: 3.7.10
Command: BBCA Downstream Task.ipynb
System Hardware:
  • CPU count: 2
  • GPU count: 1
  • GPU type: Tesla V100-SXM2-16GB
W&B CLI Version: 0.10.33
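
Because the run path above uniquely identifies this run, the same metadata can be retrieved from W&B programmatically. A minimal sketch using the wandb public API; the entity/project/run-id string is the Run path shown above, and the printed fields are standard attributes of a public-API run object:

    import wandb

    # Public API client; assumes you are authenticated (e.g. via `wandb login`).
    api = wandb.Api()

    # Run path exactly as shown above: entity/project/run_id.
    run = api.run("wandb/hf-flax-roberta-hindi/1gp0u2q6")

    print(run.name)         # "roberta-pretraining-hindi_mc4-grouptext-epoch6"
    print(run.state)        # "crashed"
    print(len(run.config))  # config dict logged for this run (140 keys here)
    print(run.summary)      # final summary metrics (18 keys here)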
Config

Config parameters are your model's inputs.

  • {} 140 keys (key names were not captured in this export; recognizable values include the checkpoint "flax-community/roberta-pretraining-hindi" and the architectures list ["RobertaForMaskedLM"])
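
For context on how a config like this ends up attached to a run: in a typical Hugging Face + W&B setup, the model and training hyperparameters are passed to wandb.init (or logged automatically by the Trainer integration). The sketch below is illustrative only; the key names follow the usual RobertaConfig/TrainingArguments conventions and are assumptions, since the actual 140 key names are not recoverable from the list above.

    import wandb

    # Illustrative config; keys are assumed, values echo recognizable entries above.
    config = {
        "model_name_or_path": "flax-community/roberta-pretraining-hindi",
        "architectures": ["RobertaForMaskedLM"],
        "adam_beta1": 0.9,
        "adam_beta2": 0.999,
        "adam_epsilon": 1e-8,
        "hidden_act": "gelu",
        "hidden_size": 768,
    }

    run = wandb.init(
        project="hf-flax-roberta-hindi",  # project name taken from the run path
        name="roberta-pretraining-hindi_mc4-grouptext-epoch6",
        config=config,                    # appears under the run's Config section
    )
    run.finish()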
Summary

Summary metrics are your model's outputs.

  • {} 18 keys (metric names were not captured in this export; recognizable entries include a graph-file record referencing "media/graph/graph_0_summary_1444cdd95f1458c12039.graph.json")
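
Summary metrics are, by default, the last value logged for each key during the run; entries can also be written to the summary explicitly. A minimal sketch of how values like the ones above get there; the metric names and numbers are placeholders except where noted, since the real keys were not preserved in this export:

    import wandb

    run = wandb.init(project="hf-flax-roberta-hindi")  # project from the run path above

    # Each wandb.log() call appends to the metric history; the last value logged
    # for a key becomes that key's entry in the run summary.
    for epoch in range(5):
        wandb.log({
            "epoch": epoch,
            "eval/accuracy": 0.60 + 0.02 * epoch,  # placeholder numbers
            "eval/loss": 2.0 - 0.1 * epoch,
        })

    # Summary entries can also be set directly, overriding the last-logged value.
    run.summary["eval/accuracy"] = 0.6951501154734411  # value taken from the list above
    run.finish()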
Artifact Outputs

This run produced these artifacts as outputs.

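Output artifacts of a run can be enumerated and downloaded through the same public API. A minimal sketch (run path as shown above; artifact names, types, and contents depend on what the notebook actually logged):

    import wandb

    api = wandb.Api()
    run = api.run("wandb/hf-flax-roberta-hindi/1gp0u2q6")

    # List whatever artifacts this run logged as outputs.
    for artifact in run.logged_artifacts():
        print(artifact.name, artifact.type, artifact.size)
        # artifact.download() would fetch the artifact's files to a local directory.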