Arampacha's group workspace

IMDB-roberta-base-hard-plabels-lr2e-05

Tags

imdb-roberta-base-hard-plabels

Notes

Simple fine-tuning of roberta-base with RAdam, lr=2e-05.
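The recipe described in the note can be sketched roughly as below. This is a minimal stand-in, not the run's actual notebook: a tiny linear head takes the place of roberta-base (loading the real checkpoint requires the transformers library and a network download), while the optimizer and learning rate match what the run logged (RAdam, lr=2e-05).

```python
# Sketch of the run's recipe: train a classifier head with RAdam at lr=2e-5.
# The Linear layer is a hypothetical stand-in for roberta-base.
import torch
from torch import nn

torch.manual_seed(0)

model = nn.Linear(768, 2)                                   # stand-in for the classification head
optimizer = torch.optim.RAdam(model.parameters(), lr=2e-5)  # RAdam, lr=2e-05 as in the run
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(8, 768)            # fake batch of pooled sentence representations
y = torch.randint(0, 2, (8,))      # fake binary labels (IMDB: pos/neg)

for _ in range(3):                 # a few optimisation steps
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(loss.item())
```

In the real run the gradient flows through the full roberta-base encoder rather than a single linear layer, but the optimizer construction and training loop have the same shape.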

Tags
imdb
plabels
radam
roberta-base
unsup
Author
State
Crashed
Start time
April 26th, 2021 7:33:07 PM
Runtime
6h 36m 54s
Tracked hours
5h 36m 43s
Run path
fastai_community/vat/213pa8rj
OS
Linux-4.19.112+-x86_64-with-Ubuntu-18.04-bionic
Python version
3.7.10
Command
_self_training_imdb-roberta-alum.ipynb
System Hardware
CPU count: 2
GPU count: 1
GPU type: Tesla T4
W&B CLI Version
0.10.27
Config

Config parameters are your model's inputs.

  • 20 config keys (key names were not preserved in this export). Recoverable values include:
    • 0.7, 1, 9,118, 8
    • "RobertaEmbeddings( (word_embeddings): Embedding(50265, 768, padding_idx=1) (position_embeddings): Embedding(514, 768, padding_idx=1) (token_type_embeddings): Embedding(1, 768) (LayerNorm): LayerNorm((768,), eps=1e-05, elementwise_affine=True) (dropout): Dropout(p=0.1, inplace=False) )"
    • "[Pipeline: ItemGetter -> read_text, Pipeline: ItemGetter -> Categorize -- {'vocab': None, 'sort': True, 'add_na': False}]"
    • "cuda"
    • "Pipeline: HF_BeforeBatchTransform", "Pipeline: HF_AfterBatchTransform", "Pipeline: ToTensor"
    • a nested 12-key entry containing 124,647,170 and several boolean flags
Summary

Summary metrics are your model's outputs.

  • 24 summary keys (key names were not preserved in this export). Recoverable values include:
    • 0.9576399922370912
    • 0.00001 (×3), 0.00000000020006587893 (×3), 0.949999999670602 (×3), 0.99 (×3)
    • 0.02139529399573803, 0.032125506550073624, 0.12217216938734056
    • integer values 0 and 4
    • a logged model graph: "graph-file", media/graph/graph_0_summary_dcf71b03.graph.json, SHA-256 dcf71b03c15fab56b0a0e92e55ffdc8a533874ace48042399e07896a7fa4ccf7, 28,208 bytes