Ariannapera's group workspace
hypo-l
Tags
round_2
State
Finished
Start time
October 9th, 2023 12:09:52 PM
Runtime
3m 31s
Tracked hours
3m 20s
Run path
cocoons/crowdsourced_vs_gpt_datasize_v2/r7trhjmy
OS
Linux-3.10.0-1062.18.1.el7.x86_64-x86_64-with-glibc2.2.5
Python version
3.8.2
Git repository
git clone git@github.com:AGMoller/worker_vs_gpt.git
Git state
git checkout -b "llama-2-70b_balanced_size1.0" 38d93efe3591823d0351c45d3ab664cc25fc945c
Command
-m src.worker_vs_gpt.datasize_experiment
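Taken together, the Git repository, Git state, and Command fields describe how this run could be reproduced. A minimal sketch, assuming the recorded module is launched with a plain `python` interpreter (the interpreter prefix is not captured in the Command field, so that part is an assumption) in an environment matching the logged Python 3.8.2:

```shell
# Clone the training repository recorded for this run.
git clone git@github.com:AGMoller/worker_vs_gpt.git
cd worker_vs_gpt

# Check out the exact commit logged under "Git state",
# on the branch name the run recorded.
git checkout -b "llama-2-70b_balanced_size1.0" 38d93efe3591823d0351c45d3ab664cc25fc945c

# Re-run the logged entry point. The "python" prefix is an assumption;
# the run only recorded the arguments "-m src.worker_vs_gpt.datasize_experiment".
python -m src.worker_vs_gpt.datasize_experiment
```

This requires the project's dependencies to be installed first; the export does not capture the dependency list.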
System Hardware
| CPU count | 24 |
| Logical CPU count | 48 |
| GPU count | 1 |
| GPU type | Tesla V100-PCIE-32GB |
W&B CLI Version
0.14.2
Group
hypo-l
Config
- {} 192 keys (key names were not captured in this export; values only below)
- "intfloat/e5-base"
- false
- 0.9
- 0.999
- 0.00000001
- false
- [] 1 item
- "BertModel"
- 0.1
- "llama-2-70b"
- false
- null
- 16
- null
- false
- false
- null
- 0
- "intfloat/e5-base"
- null
- null
- null
- false
- 0
- true
- null
- null
- null
- null
- 1,800
- [] 0 items
- null
- null
- false
- null
- 0
- true
- false
- false
- false
- false
- 0
- null
- null
- 0
- null
- "epoch"
- 30,522
- 0
- 0
- 0
(config entries 46 through 187 were collapsed and not captured in this export)
Summary
- {} 24 keys (key names were not captured in this export; values only below)
- "table-file"
- 0.9604743083003952
- 0.9604736907886516
- 0.3527873456478119
- 0.9952470293933708
- 0.4204
- 601.855
- 38.062
- 0.7120743034055728
- 0.6584342486781511
- 0.6011872887611389
- 0.7210392762577229
- 0.5984
- 539.741
- 35.092
- 10
- 1,610
- 0
- 0.0007
- 976,172,848,712,400
- 0.08400135565525997
- 198.3447
- 129.421
- 8.117
Artifact Outputs
This run produced 2 artifacts as outputs. The artifact table (Type, Name, Consumer count) did not load in this export.