Upup-ashton-wang's group workspace
Tina-LIMR-16-LoRA-rank
Tags: Tina-LIMR-16-LoRA-rank
State: Crashed
Start time: April 5th, 2025 6:02:35 PM
Runtime: 8h 28m
Tracked hours: 8h 26m 50s
Run path: upup-ashton-wang-usc/Tina/o2urtmt7
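The run path above is the identifier the W&B public API expects when fetching a run programmatically. A minimal sketch of splitting it into its parts, assuming the standard `<entity>/<project>/<run_id>` layout (the `wandb.Api().run(...)` call mentioned in the comment requires an installed, logged-in `wandb` client):

```python
# A W&B run path has the form "<entity>/<project>/<run_id>".
# The whole string can be passed directly to wandb.Api().run(RUN_PATH)
# to retrieve the run's config and summary (requires wandb login).
RUN_PATH = "upup-ashton-wang-usc/Tina/o2urtmt7"

entity, project, run_id = RUN_PATH.split("/")
print(entity)   # workspace owner / team
print(project)  # project name
print(run_id)   # unique run id
```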
OS: Linux-5.15.0-92-generic-x86_64-with-glibc2.35
Python version: CPython 3.10.16
Command: /home/omer/shangshang/workspace/reasoning/reasoning-sae/./resee/post_train_hf/grpo.py --config ./recipes/DeepSeek-R1-Distill-Qwen-1.5B/grpo/model_curated_lima_medium_rank_ablation.yaml
System Hardware
CPU count: 32
Logical CPU count: 64
GPU count: 8
GPU type: NVIDIA RTX 6000 Ada Generation
W&B CLI Version: 0.19.8
Config
225 config keys (collapsed in the W&B UI). The visible values include the base model checkpoint path "/home/omer/shangshang/project/reasoning/reasoning-sae/ckpts/models/DeepSeek-R1-Distill-Qwen-1.5B/base" and the architecture "Qwen2ForCausalLM"; the remaining key names are not recoverable from this view.
Summary
12 summary keys (collapsed in the W&B UI), including a "table-file" entry; the metric names for the remaining numeric values are not recoverable from this view.
Artifact Outputs
This run produced 6 artifacts as outputs.