Object: ChatRequest
Columns: question, chat_history, application, language
please show me how to log the output_name of the model for the current training run
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
Can multiple users write to the same stream table?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
What are some great resources for me to better understand Weights & Biases usage for Deep Learning?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
How can weights & biases help when building llm-powered apps?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
Can I get an enlarged/focused view for charts?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
is there an example of how to use Launch on Sagemaker using BYOI (Bring your own image)?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
how can I get the path to where an artifact is being written?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
does weights and biases have an integration with LangChain?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
I am logging some metrics using the wandb.log method, and I have runs grouped by Group, with each run in the group logging those metrics. Previously, when analyzing the runs in the table view in the UI, it would show the average metric of the runs within the group next to the group name, but now it shows basically nothing. Has anything changed in wandb?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
In the Runs view / Workspace, what button do we push to hide the sidebar with the list of all the run names?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
what are the best practices for logging artifacts that optimally won't consume all of my storage?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
Can public cloud customers make W&B reports accessible to their colleagues without W&B access?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
Can you introduce me to wandb? I'm a beginner.
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
Hello, I am trying to save the gradients as I would with WandbCallback, but with a custom callback. Is this correct?

```python
class GradientClipperCallback(tf.keras.callbacks.Callback):
    def __init__(self, model):
        self.model = model

    def on_epoch_end(self, epoch, logs=None):
        trainable_vars = self.model.trainable_variables
        with tf.GradientTape() as tape:
            y_pred = self.model(self.model.inputs, training=False)
        gradients = tape.gradient(y_pred, trainable_vars)...
```
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
I have a question about sweeps. How can you constrain the relationship between parameters? For example, I know that if `num_layers * hidden_dim` is large, I'll run out of GPU memory. So, while I would like to explore some of the hyperparameter space, there are some combinations I know will fail. `optuna` has a way to do that: you can throw a special exception to cancel a run during a sweep, so that it is not recorded. Is there something similar in W&B, or another way of pruning unwanted combinations of hyp...
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
I keep getting `Error in callback <bound method _WandbInit._pause_backend of <wandb.sdk.wandb_init._WandbInit object at 0x7fcee805af50>>`
```
TypeError                                 Traceback (most recent call last)
TypeError: _WandbInit._pause_backend() takes 1 positional argument but 2 were given
```
when calling `wandb.init()`. It was working ok earlier. I get this with versions 0.15.12 and 0.15.10.
How can I resolve this?
N/A
wandbot_v1.2-eval_8ec557e_chroma_v34
en
Total Rows: 199