
Levmckinney's group workspace

Group: llama-7b
Tags: huggyllama/llama-7b-1683483775

State: Finished
Start time: May 7th, 2023 6:30:07 PM
Runtime: 2h 51m 51s
Tracked hours: 2h 51m 41s
Run path: levmckinney/tuned-lens/y29i9n4d
OS: Linux-5.4.0-148-generic-x86_64-with-glibc2.35
Python version: 3.10.6
Command: -m tuned_lens.__main__ train --model.name huggyllama/llama-7b --data.name /datasets/val.jsonl --per_gpu_batch_size=1 --output /output/huggyllama/llama-7b-1683483775 --slow_tokenizer --wandb huggyllama/llama-7b-1683483775 --fsdp
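W&B logs only the Python argument list, so the interpreter prefix is missing from the command above. Below is a minimal sketch of relaunching the same training job from a script; the dataset and output paths, a single-process launch, and a local tuned-lens install are assumptions, and a 4-GPU FSDP run may additionally need to be wrapped in a distributed launcher such as torchrun.

```python
# Sketch: relaunch the recorded tuned-lens training command with the same flags.
# Assumes tuned-lens is installed and /datasets/val.jsonl and /output exist locally.
import subprocess
import sys

cmd = [
    sys.executable, "-m", "tuned_lens.__main__", "train",
    "--model.name", "huggyllama/llama-7b",
    "--data.name", "/datasets/val.jsonl",
    "--per_gpu_batch_size=1",
    "--output", "/output/huggyllama/llama-7b-1683483775",
    "--slow_tokenizer",
    "--wandb", "huggyllama/llama-7b-1683483775",
    "--fsdp",
]
subprocess.run(cmd, check=True)  # raises CalledProcessError on a non-zero exit
```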
System Hardware
CPU count: 128
Logical CPU count: 255
GPU count: 4
GPU type: NVIDIA A100-SXM4-80GB

W&B CLI version: 0.15.2
Config

The run's config tree (17 top-level keys) was captured with its key names collapsed, so only scattered values are visible, e.g. the output directory "/output/huggyllama/llama-7b-1683483775", the W&B run name "huggyllama/llama-7b-1683483775", and the loss setting "LossChoice.KL". These settings mirror the training options in the command above.
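The collapsed key names can be recovered from the run itself through the W&B public API; a minimal sketch using the run path listed above is shown here, assuming your wandb credentials can read the levmckinney/tuned-lens project.

```python
# Sketch: recover the collapsed config keys for this run via the W&B public API.
# Assumes `wandb login` has been run with an account that can read this project.
import wandb

api = wandb.Api()
run = api.run("levmckinney/tuned-lens/y29i9n4d")  # run path from the overview above

# run.config is a plain dict of the run's training options.
for key, value in sorted(run.config.items()):
    print(f"{key}: {value}")
```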
Summary

The run logged 96 summary metrics, but the collapsed tree view captured only their values (ranging from roughly 0.010 to 3.04), not their names, and entries 46 through 91 were elided by the viewer.
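The metric names and final values can likewise be pulled from the API; a minimal sketch is below, again assuming read access to the run, and run.history() additionally returns the per-step log as a pandas DataFrame.

```python
# Sketch: recover the 96 summary metrics (names and final values) for this run.
import wandb

api = wandb.Api()
run = api.run("levmckinney/tuned-lens/y29i9n4d")

print(run.summary)        # final value of every logged metric, keyed by name
history = run.history()   # per-step metrics as a pandas DataFrame
print(history.columns.tolist())
```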
Artifact Inputs
No input artifacts are listed for this run.

Artifact Outputs
This run produced 1 artifact as output; the artifact listing did not load in this capture.
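The output artifact is the tuned lens trained by this run, written under the --output directory. A minimal sketch of loading a lens for this model with the tuned-lens Python API follows; TunedLens.from_model_and_pretrained is the loader shown in the project README for recent releases, it fetches a published lens for the model from the Hugging Face Hub by default rather than this run's local output, and loading the local directory may require a version-specific argument.

```python
# Sketch: load LLaMA-7B and a tuned lens for it with the tuned-lens API.
# By default this pulls a published lens for the model from the Hugging Face Hub;
# pointing it at this run's local --output directory instead may need a
# different, version-specific argument, so check the tuned-lens docs you have installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from tuned_lens.nn.lenses import TunedLens

model_name = "huggyllama/llama-7b"
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained(model_name)

tuned_lens = TunedLens.from_model_and_pretrained(model)
```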