
[Feature]: How can we deploy FastEmbed externally as an Inference service? #395

Open
S1LV3RJ1NX opened this issue Nov 10, 2024 · 1 comment

Comments

@S1LV3RJ1NX

What feature would you like to request?

I would like to deploy FastEmbed as an external service, similar to Infinity. Can we do that?

Is there any additional information you would like to provide?

No response

@joein
Member

joein commented Nov 12, 2024

Hey @S1LV3RJ1NX
FastEmbed is a library, not an inference server. To deploy it as a service you'd need to write some code yourself, e.g. a FastAPI backend that wraps the library.
