
Kojima-takeshi188's group workspace

kakuta-2052-gpt2-large-lr0.01-iter600

Tags: kakuta-2052-gpt2-large-lr0.01-iter600
Notes:
Author:
State: Failed
Start time: March 1st, 2024 11:52:23 AM
Runtime: 2m 38s
Tracked hours: 2m 29s
Run path: weblab_lecture/vip_soutyou/2qiszpgk
OS: Linux-4.18.0-193.el8.x86_64-x86_64-with-glibc2.35
Python version: 3.10.6
Git repository: git clone git@github.com:matsuolab/llm-vip-seminar.git
Git state: git checkout -b "kakuta-2052-gpt2-large-lr0.01-iter600" 7024a87bd57b7ad64a9284520d10f3ab1d3fc327
Command: /home/acc12029sa/llm-vip-seminar-vl04/notebooks/llm/train_speech.py --your_name kakuta --model_name gpt2-large --total_iteration_num 600 --learning_rate 0.01
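The command above passes four flags to train_speech.py. The script's source is not shown on this page, so the snippet below is only a rough sketch of how such flags could be declared with argparse; the flag names mirror the logged command, while types and defaults are assumptions.

import argparse

def parse_args():
    # Hypothetical flag definitions mirroring the logged command; the real
    # train_speech.py in matsuolab/llm-vip-seminar may declare them differently.
    parser = argparse.ArgumentParser(description="GPT-2 fine-tuning (sketch)")
    parser.add_argument("--your_name", type=str, required=True,
                        help="name used to label the run, e.g. 'kakuta'")
    parser.add_argument("--model_name", type=str, default="gpt2-large",
                        help="Hugging Face model identifier")
    parser.add_argument("--total_iteration_num", type=int, default=600,
                        help="total number of training iterations")
    parser.add_argument("--learning_rate", type=float, default=0.01,
                        help="optimizer learning rate")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(args)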
System Hardware
CPU count: 72
Logical CPU count: 144
GPU count: 8
GPU type: NVIDIA A100-SXM4-40GB
W&B CLI Version: 0.15.8
Config

Config parameters are your model's inputs.

No config parameters were saved for this run.
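Since no config was saved, here is a minimal sketch of how config parameters are normally recorded with the W&B Python client so that they appear on this panel. The entity and project names are taken from the run path above; the config keys mirror the command-line flags and are otherwise assumptions.

import wandb

# Passing a config dict to wandb.init() is what populates the Config panel.
run = wandb.init(
    entity="weblab_lecture",   # from the run path weblab_lecture/vip_soutyou/2qiszpgk
    project="vip_soutyou",
    config={
        "model_name": "gpt2-large",      # assumed keys; values from the command line
        "learning_rate": 0.01,
        "total_iteration_num": 600,
    },
)
print(dict(run.config))  # the same values would then show up under Config
run.finish()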

Summary

Summary metrics are your model's outputs.

  • 3 keys:
    • 0.01
    • NaN
    • "table-file"
Artifact Inputs

This run consumed these artifacts as inputs.
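
This panel is populated when a run declares an artifact as an input via use_artifact(). A minimal sketch follows; the artifact name is a placeholder, not taken from this page.

import wandb

run = wandb.init(entity="weblab_lecture", project="vip_soutyou")

# Declaring an artifact as an input is what lists it under "Artifact Inputs".
artifact = run.use_artifact("speech-dataset:latest")  # placeholder artifact name
local_dir = artifact.download()  # downloads the artifact's files locally
print(local_dir)

run.finish()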