Batu's group workspace
JumpMLP_Observations
Tags: observation_specification:{'vector': slice(0, 19, None)}
Notes:
Author:
State: Finished
Start time: December 16th, 2021 11:38:56 AM
Runtime: 2m 48s
Tracked hours: 2m 42s
Run path: batu/BehaviorCloning/kppa124o
OS: Windows-10-10.0.19041-SP0
Python version: 3.7.10
Git repository: git clone https://github.com/batu/RLAgency.git
Git state: git checkout -b "observation_specification{'vector'-slice(0,-19,-None)}" 6e4e3bb5250b23f7cc217320753f7a5bca39a6ab
Command: <python with no main file> --ip=127.0.0.1 --stdin=9008 --control=9006 --hb=9005 --Session.signature_scheme="hmac-sha256" --Session.key=b"3788daf4-3215-4e59-b4dd-8c9489ac9561" --shell=9007 --transport="tcp" --iopub=9009 --f=C:\Users\batua\AppData\Local\Temp\tmp-16980dcSENUV03Q17.json
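The observation_specification tag above restricts the policy input to the first 19 entries of the flat "vector" observation. A minimal sketch of how such a spec could be applied to a raw observation dict; the helper and variable names here are illustrative, not taken from the repository code:

```python
import numpy as np

# Mirrors the tag logged for this group: keep only the first 19 entries
# of the flat "vector" observation stream.
observation_specification = {"vector": slice(0, 19, None)}

def apply_observation_spec(raw_obs, spec):
    """Illustrative helper: slice each named observation stream per the spec."""
    return {name: raw_obs[name][sl] for name, sl in spec.items()}

raw_obs = {"vector": np.random.randn(32)}        # e.g. a 32-dim raw sensor vector
obs = apply_observation_spec(raw_obs, observation_specification)
assert obs["vector"].shape == (19,)              # matches in_features=19 in the config below
```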
System Hardware
| CPU count | 16 |
| GPU count | 1 |
| GPU type | NVIDIA GeForce RTX 3070 Laptop GPU |
W&B CLI Version: 0.12.6
Group: JumpMLP_Observations

Config
Config parameters are your model's inputs.
- {} 16 keys (values below; the logged model architecture and training settings are sketched after this list):
  - "torch.nn.functional.relu"
  - "BaselineNoGraph( (dropout): Dropout(p=0, inplace=False) (layers): ModuleList( (0): Linear(in_features=19, out_features=512, bias=True) (1): Linear(in_features=512, out_features=512, bias=True) (2): Linear(in_features=512, out_features=3, bias=True) ) (input_layer): Linear(in_features=19, out_features=512, bias=True) (hidden_layers): ModuleList( (0): Linear(in_features=512, out_features=512, bias=True) ) (output_layer): Linear(in_features=512, out_features=3, bias=True) )"
  - 128
  - 0
  - "Jump"
  - 512
  - 19
  - 0
  - 0.001
  - "torch.nn.functional.mse_loss"
  - 50
  - 3
  - "pos"
  - {} 1 key:
    - {} 3 keys:
      - 0
      - null
      - 19
  - 2,106
  - "InputSweep"
Summary
Summary metrics are your model's outputs.
- {} 20 keys (values below):
  - 49
  - {} 7 keys
  - 0.00010685406596167012
  - {} 7 keys
  - 0.62
  - 0.17514070868492126
  - 0.07913825660943985
  - 0.07183842360973358
  - 8,799
  - 0.33316442370414734
  - 0.24756859242916107
  - 0.23375990986824036
  - 0.33011963963508606
  - 0.4207627177238464
  - 0.3651314973831177
  - 0.2905857563018799
  - 0.2992700934410095
  - 0.2915428876876831
  - 0.27032607793807983
  - 0.21508283913135529
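The config and summary trees above show values with their key names collapsed; the names can be recovered with the W&B public API using the run path from the overview. A minimal sketch, assuming read access to the batu/BehaviorCloning project:

```python
import wandb

api = wandb.Api()
run = api.run("batu/BehaviorCloning/kppa124o")   # run path from the overview above

# Config parameters (model inputs) as a plain key -> value mapping.
for key, value in run.config.items():
    print(f"config.{key} = {value}")

# Summary metrics (model outputs); _json_dict exposes the raw key -> value mapping.
for key, value in run.summary._json_dict.items():
    print(f"summary.{key} = {value}")
```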
Artifact Outputs
This run produced 1 artifact as an output.
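The artifact's type and name are not shown in this export, but they can be enumerated through the same public API; a short sketch, assuming a wandb version that provides Run.logged_artifacts():

```python
import wandb

api = wandb.Api()
run = api.run("batu/BehaviorCloning/kppa124o")

# List the artifacts this run logged as outputs (the overview reports a total of 1).
for artifact in run.logged_artifacts():
    print(artifact.type, artifact.name)
```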