Multiple Embedding-Models within one server instance #92
Closed
ChristophRaab started this conversation in Ideas
Replies: 1 comment
-
Hello! No, that's not something that is possible or planned. The reason we do not plan to add such a feature is that we feel other parts of the stack already cover it (for example MIG).
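In practice this means running one server instance per model and, if you want to share a single GPU, pinning each instance to its own MIG slice. Below is a minimal launcher sketch, not an official feature of the server: the `text-embeddings-router` binary name and the `--model-id`/`--port` flags are assumptions modeled on Hugging Face's text-embeddings-inference, and the model IDs and MIG UUIDs are placeholders for your own.

```python
# Minimal sketch: start one server process per embedding model, each on its
# own port. Binary name, flags, model IDs, and MIG UUIDs are assumptions /
# placeholders -- substitute whatever your inference server actually uses.
import os
import subprocess

INSTANCES = [
    # (port, model id, optional MIG slice to pin the process to)
    (8080, "my-org/embeddings-search", "MIG-11111111-2222-3333-4444-555555555555"),
    (8081, "my-org/embeddings-classification", "MIG-66666666-7777-8888-9999-000000000000"),
]

processes = []
for port, model_id, mig_uuid in INSTANCES:
    env = dict(os.environ)
    if mig_uuid:
        # Restrict this instance to a single MIG slice of the GPU.
        env["CUDA_VISIBLE_DEVICES"] = mig_uuid
    cmd = ["text-embeddings-router", "--model-id", model_id, "--port", str(port)]
    processes.append(subprocess.Popen(cmd, env=env))

# Block until the servers exit (Ctrl+C to stop them).
for proc in processes:
    proc.wait()
```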
-
Hi,
I want to host multiple embedding models serving different purposes (each fine-tuned on mutually exclusive data). Although I have not found anything about it in the docs, I was wondering whether it is possible to run multiple embedding models at the same time within one inference server. Currently, one would start a separate instance of the inference server for every model, right?
Is something like this possible or planned?
Thanks in advance!
Best
Chris
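Until such a feature exists, a thin client can route each request to the instance that serves the right fine-tuned model. A sketch, assuming the per-model instances launched above expose text-embeddings-inference's `/embed` endpoint with an `{"inputs": ...}` payload; the ports and purpose names are hypothetical.

```python
import requests

# Hypothetical mapping from use case to the instance serving that model.
MODEL_ENDPOINTS = {
    "search": "http://localhost:8080",
    "classification": "http://localhost:8081",
}

def embed(texts, purpose):
    """Send texts to the server instance hosting the model for this purpose."""
    base_url = MODEL_ENDPOINTS[purpose]
    # /embed and the payload shape follow text-embeddings-inference's HTTP API;
    # adjust if your server exposes a different route.
    response = requests.post(f"{base_url}/embed", json={"inputs": texts}, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    vectors = embed(["hello world"], purpose="search")
    print(f"got {len(vectors)} embedding(s)")
```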