Parambharat's group workspace
small-te
bumbling-totem-62
Author
parambharat
State
Finished
Start time
December 13th, 2022 8:08:58 AM
Runtime
1d 6h 43m 41s
Tracked hours
19h 50m 26s
Run path
parambharat/whisper_finetuning/28l7467b
OS
Linux-5.4.0-125-generic-x86_64-with-glibc2.17
Python version
3.8.15
Git repository
git clone https://github.com/parambharat/whisper-finetuning.git
Git state
git checkout -b "bumbling-totem-62" 72affd4fb13fa64264e65c8f246f84359e63fd45
Command
whisper_small_te.ipynb
System Hardware
| Hardware | Value |
| --- | --- |
| CPU count | 16 |
| Logical CPU count | 16 |
| GPU count | 1 |
| GPU type | NVIDIA A100-PCIE-40GB |
W&B CLI Version
0.13.6
Group
small-te
Job Type
fine-tuning
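
The Run path above (entity/project/run_id) can be used to pull this run programmatically through the W&B public API. A minimal sketch, assuming the `wandb` Python package is installed and an API key is configured; only the run path itself comes from this page:

```python
import wandb

# Fetch the run by its run path: <entity>/<project>/<run_id>.
api = wandb.Api()
run = api.run("parambharat/whisper_finetuning/28l7467b")

print(run.name)   # run name, e.g. "bumbling-totem-62"
print(run.state)  # lifecycle state, e.g. "finished"
print(run.url)    # link back to the run page
```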
Config
Config parameters are your model's inputs.
189 config keys were logged for this run (the model and training configuration). Their names were not captured in this export; recognizable values include the input model artifact "./artifacts/model-28l7467b:v2", the "WhisperForConditionalGeneration" architecture, and the "gelu" activation. The full config can be read back from the API (see the snippet below).
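
Because the parameter names are missing above, the simplest way to inspect the full config is through the public API. A short sketch; apart from the run path, nothing here is taken from this page:

```python
import wandb

api = wandb.Api()
run = api.run("parambharat/whisper_finetuning/28l7467b")

# run.config holds every logged config parameter as a plain dict.
config = dict(run.config)
print(f"{len(config)} config keys logged")

# Preview a handful of parameters in alphabetical order.
for key in sorted(config)[:10]:
    print(f"{key} = {config[key]}")
```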
Summary
Summary metrics are your model's outputs.
15 summary metrics were logged for this run. Their names were not captured in this export, so the bare values are not reproduced here; the full summary can be read back from the API (see the snippet below).
Artifact Outputs
This run produced 42 artifacts as outputs; the artifact table (type, name, consumer count) did not load in this export, but the list can be read back from the API (see the snippet below).
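
The output artifacts can be listed, and the fine-tuned checkpoint downloaded, through the same API. A hedged sketch: the qualified artifact name `parambharat/whisper_finetuning/model-28l7467b:v2` is an assumption based on the `./artifacts/model-28l7467b:v2` path seen in the config and may need adjusting:

```python
import wandb

api = wandb.Api()
run = api.run("parambharat/whisper_finetuning/28l7467b")

# List every artifact this run logged as an output.
for artifact in run.logged_artifacts():
    print(artifact.type, artifact.name)

# Download the checkpoint artifact referenced in the config.
# NOTE: the qualified name below is assumed from the
# "./artifacts/model-28l7467b:v2" path in the config.
model_artifact = api.artifact("parambharat/whisper_finetuning/model-28l7467b:v2")
model_dir = model_artifact.download()
print(f"model files downloaded to {model_dir}")
```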