Inference batch size #989
inakierregueab started this conversation in General
As far as I can tell from the code, if N slices are generated, inference is performed N times with a batch size of 1. Are you planning to make this batch size customisable? Ideally it would be a single forward pass with a batch size of N, the number of slices. That would speed up inference! :) Thank you.
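For reference, a minimal sketch of the difference, assuming a PyTorch model and a list of pre-extracted 2D slices. The function names and signatures below are placeholders for illustration, not the repository's actual API:

```python
import torch

def predict_per_slice(model: torch.nn.Module, slices: list[torch.Tensor]) -> torch.Tensor:
    """Current behaviour (as I understand it): N forward passes, each with batch size 1."""
    with torch.no_grad():
        outputs = [model(s.unsqueeze(0)) for s in slices]  # one model call per slice
    return torch.cat(outputs, dim=0)

def predict_batched(model: torch.nn.Module, slices: list[torch.Tensor],
                    batch_size: int | None = None) -> torch.Tensor:
    """Proposed behaviour: stack the slices and run fewer, larger forward passes."""
    batch = torch.stack(slices, dim=0)          # shape (N, C, H, W)
    if batch_size is None:
        batch_size = len(slices)                # single forward pass over all N slices
    with torch.no_grad():
        chunks = [model(batch[i:i + batch_size])
                  for i in range(0, len(slices), batch_size)]
    return torch.cat(chunks, dim=0)
```

A configurable `batch_size` (rather than always N) would also let users cap GPU memory usage when N is large.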