
[FEAT]: Add support for specifying maxConcurrentChunks for Generic OpenAI Embedder #2654

Open
hdelossantos opened this issue Nov 20, 2024 · 0 comments · May be fixed by #2655
Labels: enhancement (New feature or request), feature request

What would you like to see?

Description:

Currently, the Generic OpenAI Embedder doesn't offer a way to specify the maximum number of concurrent chunks per embedding request and always defaults to 500. This limits its usability with OpenAI-compatible embedders that cap the batch size. For example, when using an embedder with a batch-size limit of 32, any document with more than 32 chunks results in a 413 error.

Request:

Add an optional maxConcurrentChunks parameter to the Generic OpenAI Embedder UI, allowing users to control the maximum number of chunks sent per embedding request. This would let embedders that enforce batch-size restrictions work correctly.
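For illustration, the requested behavior could be sketched as follows. This is a hypothetical implementation, not AnythingLLM's actual code: the function names (`toChunkBatches`, `embedChunks`) and the option shape are assumptions, and the model name is a placeholder. The idea is simply to split the chunk list into batches no larger than the configured maxConcurrentChunks before calling the OpenAI-compatible embeddings endpoint, so a backend capped at 32 inputs per request never sees an oversized payload.

```javascript
// Hypothetical sketch of maxConcurrentChunks batching.
// Split an array of text chunks into batches of at most
// maxConcurrentChunks items each.
function toChunkBatches(chunks, maxConcurrentChunks = 500) {
  const batches = [];
  for (let i = 0; i < chunks.length; i += maxConcurrentChunks) {
    batches.push(chunks.slice(i, i + maxConcurrentChunks));
  }
  return batches;
}

// Embed all chunks, one request per batch, so each request stays
// within the provider's limit (e.g. 32) and avoids a 413 error.
// `openai` is assumed to be an OpenAI-SDK-compatible client.
async function embedChunks(openai, chunks, { maxConcurrentChunks = 32 } = {}) {
  const embeddings = [];
  for (const batch of toChunkBatches(chunks, maxConcurrentChunks)) {
    const { data } = await openai.embeddings.create({
      model: "text-embedding-ada-002", // placeholder model name
      input: batch,
    });
    embeddings.push(...data.map((d) => d.embedding));
  }
  return embeddings;
}
```

With maxConcurrentChunks set to 32, a 70-chunk document would be embedded in three requests of 32, 32, and 6 chunks rather than one 500-cap request that the backend rejects.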
