Ariannapera's group workspace
empathy#empathy_bin
Tags
llama-2-70b_balanced_size:1.0
Notes
Tags
round_2
Author
State
Finished
Start time
October 9th, 2023 12:24:17 PM
Runtime
3m 45s
Tracked hours
3m 37s
Run path
cocoons/crowdsourced_vs_gpt_datasize_v2/eowhytcn
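The run path above uniquely identifies this run as `entity/project/run_id`. A minimal sketch of working with it programmatically, assuming the `wandb` package is installed and you are authenticated (the path is taken verbatim from the Run path field; everything else is an assumption):

```python
# The "Run path" value recorded for this run.
RUN_PATH = "cocoons/crowdsourced_vs_gpt_datasize_v2/eowhytcn"


def split_run_path(path: str):
    """A W&B run path has the form entity/project/run_id."""
    entity, project, run_id = path.split("/")
    return entity, project, run_id


entity, project, run_id = split_run_path(RUN_PATH)
print(entity, project, run_id)

# With wandb installed and `wandb login` completed, the live run object
# (and its Config / Summary sections shown on this page) is reachable as:
#   import wandb
#   run = wandb.Api().run(RUN_PATH)
#   run.config, run.summary
```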
OS
Linux-3.10.0-1062.18.1.el7.x86_64-x86_64-with-glibc2.2.5
Python version
3.8.2
Git repository
git clone git@github.com:AGMoller/worker_vs_gpt.git
Git state
git checkout -b "llama-2-70b_balanced_size1.0" 38d93efe3591823d0351c45d3ab664cc25fc945c
Command
-m src.worker_vs_gpt.datasize_experiment
System Hardware
| CPU count | 24 |
| Logical CPU count | 48 |
| GPU count | 1 |
| GPU type | Tesla V100-PCIE-32GB |
W&B CLI Version
0.14.2
Group
empathy#empathy_bin
Config
Config parameters are your model's inputs.
192 keys (key names are collapsed in this export; values only):
- "intfloat/e5-base"
- false
- 0.9
- 0.999
- 0.00000001
- false
- [] 1 item▶
- "BertModel"
- 0.1
- "llama-2-70b"
- false
- null
- 16
- null
- false
- false
- null
- 0
- "intfloat/e5-base"
- null
- null
- null
- false
- 0
- true
- null
- null
- null
- null
- 1,800
- [] 0 items
- null
- null
- false
- null
- 0
- true
- false
- false
- false
- false
- 0
- null
- null
- 0
- null
- "epoch"
- 30,522
- 0
- 0
- 0
(keys 46–187 not expanded in this export)
Summary
Summary metrics are your model's outputs.
24 keys (metric names are collapsed in this export; values only):
- "table-file"
- 0.9261744966442952
- 0.9259610606676604
- 0.3870871961116791
- 0.9683290862759376
- 0.3098
- 480.936
- 32.278
- 0.6290322580645161
- 0.6251204907258654
- 0.6842293739318848
- 0.6819259687680741
- 0.7058
- 263.548
- 17.003
- 10
- 950
- 0
- 0.0016
- 1,405,703,702,142,720
- 0.116439980362591
- 214.6255
- 70.448
- 4.426
Artifact Outputs
This run produced these artifacts as outputs. Total: 2.