Batch Inference #749
habibaezz01
started this conversation in General
Does the model support batch inference? I think that when I give it a very long input, the context window limit hurts the quality of the output. When I tried to increase the context window, I got very cold and emotionless output instead.
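
In case a sketch helps: one common workaround for long inputs is to split the text into chunks and run the chunks as a batch, so each chunk stays well inside the context window. The example below is a minimal illustration using a Hugging Face `text-generation` pipeline; the `gpt2` model name, the chunk size, and the whole chunking strategy are placeholder assumptions, not this repo's actual API.

```python
# Hypothetical sketch: chunk a long input and run the chunks as one batch,
# assuming a Hugging Face-style pipeline (not necessarily this project's API).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # placeholder model
# GPT-2 has no pad token by default; batching needs one for padding.
generator.tokenizer.pad_token = generator.tokenizer.eos_token

def chunk_text(text: str, max_words: int = 100) -> list[str]:
    """Split a long input into chunks of roughly max_words words each."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

long_input = "..."  # your very long prompt goes here
chunks = chunk_text(long_input)

# Passing a list of inputs with batch_size runs them as a batch instead of
# one oversized prompt, keeping each chunk inside the context window.
outputs = generator(chunks, batch_size=4, max_new_tokens=50)
for out in outputs:
    print(out[0]["generated_text"])
```

Whether this preserves the expressiveness you are after depends on the model; chunking trades cross-chunk context for staying inside the window, so it may not fix the "cold" outputs if they come from the model itself rather than from truncation.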