Inference inside callback #19752
Closed
quentinblampey started this conversation in General
Replies: 1 comment
Hello,
I have a weird issue when running inference with a callback. I guess I'm not using PyTorch Lightning the intended way.
After each epoch, inside a callback, I run trainer.predict on a small dataset to plot some figures that I save with Weights & Biases. I thought it was pretty standard to do so, but I got the error below. I wrote this dummy minimal reproducible example:
Error log
Versions:
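For context, a callback that calls trainer.predict at the end of every training epoch might look roughly like the sketch below. This is only an illustration of the reported pattern, not the author's actual reproducer; ToyModel, PredictEveryEpoch, small_loader, and the random data are assumed names.

import torch
from torch.utils.data import DataLoader, TensorDataset

import lightning as L
from lightning.pytorch.callbacks import Callback


class ToyModel(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def predict_step(self, batch, batch_idx=0, dataloader_idx=0):
        x, _ = batch
        return self.layer(x)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


def small_loader(n=32):
    # Tiny random dataset standing in for the "small dataset" mentioned above
    x, y = torch.randn(n, 8), torch.randn(n, 1)
    return DataLoader(TensorDataset(x, y), batch_size=8)


class PredictEveryEpoch(Callback):
    def on_train_epoch_end(self, trainer, pl_module):
        # Calling trainer.predict from inside a fit hook is the pattern
        # reported to fail in this discussion.
        predictions = trainer.predict(pl_module, dataloaders=small_loader())
        ...  # plot `predictions` and log the figure to Weights & Biases


if __name__ == "__main__":
    trainer = L.Trainer(max_epochs=2, logger=False, callbacks=[PredictEveryEpoch()])
    trainer.fit(ToyModel(), train_dataloaders=small_loader())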
Reply:
Using:

import lightning as L
from lightning.pytorch.callbacks import Callback

class PlotCallback(Callback):
    def on_train_epoch_end(self, trainer: L.Trainer, model: Model) -> None:  # Model: the user's LightningModule
        # Run prediction manually instead of calling trainer.predict inside the hook
        loader = model.predict_dataloader()
        for batch in loader:
            batch = model.transfer_batch_to_device(batch, model.device, 0)
            model.predict_step(batch)
            ...  # save figure to wandb
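The reply elides the actual plotting and logging step. As a sketch only, assuming a WandbLogger, a matplotlib figure, and a LightningModule that defines predict_dataloader and a one-argument predict_step (none of which are spelled out in the reply), the callback could be wired up like this:

import matplotlib.pyplot as plt
import torch
import wandb

import lightning as L
from lightning.pytorch.callbacks import Callback
from lightning.pytorch.loggers import WandbLogger


class PlotCallback(Callback):
    def on_train_epoch_end(self, trainer: L.Trainer, model: L.LightningModule) -> None:
        model.eval()
        outputs = []
        with torch.no_grad():
            for batch in model.predict_dataloader():
                batch = model.transfer_batch_to_device(batch, model.device, 0)
                outputs.append(model.predict_step(batch))  # assumes predict_step returns a tensor
        model.train()

        # Plot the predictions and send the figure to Weights & Biases
        fig, ax = plt.subplots()
        ax.hist(torch.cat(outputs).flatten().cpu().numpy(), bins=20)
        trainer.logger.experiment.log({"predictions": wandb.Image(fig)})
        plt.close(fig)


trainer = L.Trainer(max_epochs=5, logger=WandbLogger(project="demo"), callbacks=[PlotCallback()])
# trainer.fit(model, ...)  # model: the user's LightningModule with predict_dataloader/predict_step

The point of the reply is to drive predict_step by hand rather than re-entering trainer.predict from inside a fit hook, which is what triggered the error in the first place.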