What would you like to see?
Description:
Currently, the Generic OpenAI Embedder doesn't offer a way to specify a maximum number of concurrent chunks for embedding and always defaults to 500. This limits its usability with OpenAI-compatible embedders that impose limits on chunk batch size. For example, when using an embedder with a batch size limit of 32, embedding any document with more than 32 chunks results in a 413 error.
Request:
Add an optional maxConcurrentChunks parameter to the Generic OpenAI Embedder UI, allowing users to control the maximum number of chunks sent per embedding request. This would allow embedders that enforce batch size restrictions to work.