How to collect batched predictions? #9379
-
Hello :) i.e. each `predict_step()` returns a tensor of shape `batch_size x 10`. This is just a simple example to illustrate my problem. My actual return per prediction step is a bit more involved, and I would like to clean up the structure before returning it, and also make this cleanup logic part of the module so that I don't have to remember the specifics. Is there a way to do that in the current framework, and if so, how?
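The pattern being asked about, collecting per-batch outputs and letting the module itself own the cleanup logic, can be sketched without the library. This is a framework-free sketch: the names `predict_step`, `on_predict_epoch_end`, and `run_predict` mirror the shape of Lightning's API but are hypothetical stand-ins here, and plain lists stand in for the `batch_size x 10` tensors.

```python
# Framework-free sketch of the pattern from the question: a module whose
# predict_step returns one output per batch, plus an epoch-end hook that
# collects and cleans up all batch outputs. The names mimic PyTorch
# Lightning's API but this sketch does not import the library.

class PredictModule:
    def predict_step(self, batch, batch_idx):
        # Per-batch prediction; here a toy transform stands in for a
        # model forward that would return a batch_size x 10 tensor.
        return [x * 2 for x in batch]

    def on_predict_epoch_end(self, results):
        # Cleanup logic lives on the module, so callers never need to
        # remember the output structure: flatten the list of batch outputs.
        return [item for batch_out in results for item in batch_out]


def run_predict(module, dataloader):
    # Minimal stand-in for a trainer's predict loop: run every batch,
    # then let the module clean up the collected outputs.
    results = [module.predict_step(batch, i) for i, batch in enumerate(dataloader)]
    return module.on_predict_epoch_end(results)


module = PredictModule()
batches = [[1, 2, 3], [4, 5]]
print(run_predict(module, batches))  # → [2, 4, 6, 8, 10]
```

The key design point is that the caller only sees the cleaned-up result; the per-batch structure is an implementation detail of the module.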
Replies: 1 comment
-
Issue to track #9380