
Arampacha's group workspace

imdb-roberta-base-hf-3e-05

Tags

imdb-roberta-base-alum

Notes

HF fine-tuning of roberta-base with AdamW, lr=3e-05.
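
The note above describes plain Hugging Face fine-tuning. The sketch below is a hypothetical reconstruction of such a setup, not the author's script: the model name and learning rate come from the note, while the epoch count, batch size, warmup steps, evaluation strategy, and mixed precision are assumptions inferred from values visible in the Config and Summary sections. The adversarial-training component suggested by the "alum" tag and the "vat" run path is not included.

# Hypothetical reconstruction; requires transformers, datasets, and wandb.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

def tokenize(batch):
    # Truncation length is an assumption; it is not recorded in this export.
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

args = TrainingArguments(
    output_dir="imdb-roberta-base-hf-3e-05",
    learning_rate=3e-05,              # from the run notes; AdamW is the Trainer default
    num_train_epochs=5,               # assumed from the epoch-like value 5 in Summary
    per_device_train_batch_size=64,   # assumed from the value 64 in Config
    warmup_steps=200,                 # assumed from the value 200 in Config
    evaluation_strategy="epoch",      # "epoch" appears in Config
    fp16=True,                        # the "O1" opt level in Config suggests mixed precision
    report_to="wandb",                # log metrics to Weights & Biases
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    tokenizer=tokenizer,
)
trainer.train()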

Tags
adamw
alum
imdb
roberta-base
Author
State
Finished
Start time
April 22nd, 2021 4:31:50 PM
Runtime
2h 6m 43s
Tracked hours
1h 16m 55s
Run path
fastai_community/vat/1ou29fny
OS
Linux-4.15.0-123-generic-x86_64-with-debian-buster-sid
Python version
3.7.6
Command
<python with no main file> -f /workspace/.local/share/jupyter/runtime/kernel-37a4d256-31ea-47e5-a11e-7b3f699345fd.json
System Hardware
CPU count
56
GPU count
1
GPU type
Tesla V100S-PCIE-32GB
W&B CLI Version
0.10.12
Config

Config parameters are your model's inputs.

  • {} 130 keys: the exported view dropped the key names and kept only the values. The visible values are consistent with a stock roberta-base configuration plus Hugging Face training arguments, e.g. "roberta-base", "RobertaForMaskedLM", "gelu" activation, hidden size 768, vocabulary size 50,265, Adam betas 0.9/0.999 with epsilon 1e-08, fp16 opt level "O1", evaluation strategy "epoch", and the values 64 and 200, which likely correspond to a per-device batch size and warmup steps.
Summary

Summary metrics are your model's outputs.

  • {} 11 keys: the exported view dropped the metric names and kept only the values (0.958, 0.1252136081457138, 249.8749, 100.05, 5, 7,815, 0.00000000042609214894, 0.2149, 25,418,934,405,540,960, 4,539.5106, 1.722). These plausibly correspond to evaluation accuracy and loss, evaluation runtime and throughput, epoch count, global step, final learning rate, training loss, total FLOPs, and training runtime, though the mapping is not recoverable from this export.
Artifact Outputs

This run produced these artifacts as outputs. Total: 1.
