Feature request

I have been searching through a lot of websites and watching YouTube videos on how to deploy open-source LLM models locally on a Windows server and then expose them to users, who could interact with the LLM and ask questions from their own laptop's web browser. I believe this could be achieved with OpenLLM; however, I am not sure whether this capability is already included in the library.
Motivation
No response
Other
No response
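For anyone landing here: recent OpenLLM releases expose an OpenAI-compatible HTTP API once a model server is running (started with something like `openllm serve <model>`; the exact command, flags, and the default port of 3000 can differ between versions, so treat this as a sketch rather than a definitive recipe). A minimal client-side example, with the server address and model name as placeholders:

```python
# Minimal sketch: query a locally running OpenLLM server from another
# machine on the network. Assumes a recent OpenLLM release that exposes
# an OpenAI-compatible API (default port 3000); the server address and
# model name below are placeholders, not values from this thread.
from openai import OpenAI

client = OpenAI(
    base_url="http://<windows-server-ip>:3000/v1",  # replace with the server's LAN address
    api_key="na",  # OpenLLM does not require a real API key by default
)

response = client.chat.completions.create(
    model="<model-name>",  # placeholder: whichever model the server is running
    messages=[{"role": "user", "content": "Hello, what can you do?"}],
)
print(response.choices[0].message.content)
```

Any laptop on the network can hit the same endpoint (or an off-the-shelf chat UI can be pointed at it), provided the server process listens on a network-reachable interface rather than only on localhost.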
Have you found a way, @sanket038? I am also looking into how to host OpenLLM on my server and then make API calls to it. Any ideas on hosting OpenLLM from the server? If so, please help me out.
Do you know the steps to link a custom downloaded model to Ollama and then serve it as an API for everyone? I have deployed a chatbot UI and need backend code exposing an API that all members can access, i.e. UIs on multiple devices pinging the same server. @euroblaze, if you have Discord, please send me an invite link at [email protected] so we can connect.
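In case it helps, a rough sketch of the Ollama route, assuming a standard Ollama install (the model names and file paths below are placeholders): register the downloaded weights with a Modelfile containing `FROM ./your-model.gguf`, run `ollama create my-model -f Modelfile`, and start the server with `OLLAMA_HOST=0.0.0.0` so machines other than the server itself can reach it. The chatbot UI's backend can then call Ollama's built-in REST API:

```python
# Minimal sketch: call an Ollama server from another machine via its REST
# API. Assumes the custom model was registered on the server beforehand
# (ollama create my-model -f Modelfile) and that Ollama listens on all
# interfaces (OLLAMA_HOST=0.0.0.0). "my-model" and <server-ip> are
# placeholders.
import requests

resp = requests.post(
    "http://<server-ip>:11434/api/chat",  # 11434 is Ollama's default port
    json={
        "model": "my-model",  # placeholder: the name given to `ollama create`
        "messages": [{"role": "user", "content": "Hello from another device"}],
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```

Any number of UI instances can point at this one endpoint; for anything beyond a trusted LAN, you would want a reverse proxy with authentication in front of it.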