Thank you very much for developing such a successful and practical educational app.
I am particularly concerned about privacy. Is it possible to use smaller models (for example, 70B or even 8B) through Ollama as the inference backend, running in local Docker containers, to set up this tutor?
I think this might require a rewrite of some part of the code because, as far as I understand, the website interacts directly with the Together API. The Together API is OpenAI compatible, and so is Ollama, so you could try changing the API calls to point at Ollama's OpenAI-compatible endpoint.
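For reference, here is a minimal sketch of what that swap could look like, assuming the app uses the official `openai` npm client (or an OpenAI-compatible one) to call Together. The model name, prompt, and port here are placeholders, not the project's actual configuration:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  // Ollama exposes an OpenAI-compatible API under /v1; 11434 is its default port.
  baseURL: "http://localhost:11434/v1",
  // Ollama ignores the API key, but the client requires a non-empty string.
  apiKey: "ollama",
});

async function main() {
  const completion = await client.chat.completions.create({
    // Any model pulled locally, e.g. `ollama pull llama3.1:8b`.
    model: "llama3.1:8b",
    messages: [
      { role: "system", content: "You are a helpful tutor." },
      { role: "user", content: "Explain photosynthesis in simple terms." },
    ],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

In principle, only the base URL, API key, and model name need to change; the rest of the chat-completion calls should stay the same since both backends follow the OpenAI API shape.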