I've had some success using RisuAI and Ollama with these settings:
Custom OpenAI-compatible
URL: http://localhost:11434
Proxy Key/Password: ollama
Request Model: Custom - fluffy/l3-8b-stheno-v3.2:latest
Tokenizer: Llama3
Response Streaming: Enabled
You also need to import a preset for Context and Instruct which you can do within RisuAI.
L3 Instruct Mode.json L3 Context Template.json
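The settings above work because Ollama serves an OpenAI-compatible API under `/v1`, which is what RisuAI's custom (OpenAI) backend speaks. A minimal sketch of the request those settings produce, assuming a default Ollama install on port 11434 with `llama3.1` already pulled (the model name and helper names here are illustrative, not part of RisuAI):

```python
import json
from urllib import request

# Assumption: default Ollama install. Ollama exposes an OpenAI-compatible
# endpoint under /v1, so the base URL is http://localhost:11434/v1.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt, model="llama3.1"):
    """Build the OpenAI-style chat request Ollama's /v1 endpoint accepts."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the key's value, but OpenAI-style clients
            # (like RisuAI's custom backend) still send one.
            "Authorization": "Bearer ollama",
        },
    )

def chat(prompt, model="llama3.1"):
    """Send the request and return the assistant's reply (needs a running server)."""
    with request.urlopen(build_chat_request(prompt, model)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Note the base URL is just `http://localhost:11434` with no `/api/...` path: the client appends the OpenAI-style `/v1/chat/completions` itself.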
Tested with the default address and port:
http://127.0.0.1:11434/
llama3.1
llama3.1:latest
Mimicking the oobabooga setup (https://docs.sillytavern.app/usage/api-connections/) did not result in a connection either.
https://youtu.be/SxRiRZu_Jhc?si=2-sG1iUQx0DLbxq4
kobold ai setup: https://www.youtube.com/watch?v=ksBWKa_30Hc
https://www.reddit.com/r/RisuAI/comments/1d57aki/does_anyone_know_how_to_connect_ollama_local_with/ was not much help either.
Tested, and these didn't work:
http://localhost:11434/api/chat
http://localhost:11434/api/v1/generate
http://localhost:11434/api/embed
http://localhost:11434/api/tokenize
http://localhost:11434/api/complete
http://127.0.0.1:11434/api/chat
http://127.0.0.1:11434/api/v1/generate
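For what it's worth, several of those URLs are not part of Ollama's native API at all (`/api/v1/generate`, `/api/tokenize`, `/api/complete` don't exist; `/api/chat` and `/api/embed` do, but RisuAI's custom backend speaks the OpenAI protocol, which Ollama serves under `/v1` instead). A quick way to check that the server is up before blaming the client is to hit `/api/tags`, which lists installed models. A minimal probe sketch, assuming the default port 11434:

```python
from urllib import request, error

def probe(path, base="http://localhost:11434"):
    """Return the HTTP status for base+path, or None if unreachable."""
    try:
        with request.urlopen(base + path, timeout=2) as resp:
            return resp.status  # 200 means the endpoint exists and answered
    except error.HTTPError as e:
        return e.code  # server is up, but this path returned an error (e.g. 404)
    except OSError:
        return None  # connection refused / timed out: server not reachable

# /api/tags lists installed models and doubles as a liveness check.
print("/api/tags ->", probe("/api/tags"))
```

A `404` here still proves the server is reachable, which narrows the problem down to the URL or protocol rather than the connection.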
Adding an example for the default setup would help a lot of people:
openai gpt
anthropic claude d5837e5
custom (open ai)
oobabooga
mancer
openrouter (RisuAI/src/ts/model/openrouter.ts, line 9 in de6c90c)
mistral api
google gemini
kobold 90d1660
novellist
cohere
novel ai
horde
ollama https://github.com/search?q=repo%3Akwaroran%2FRisuAI+ollama&type=code
"sonnet 3.5 for aws and custom" 25a60db
fal.ai de6c90c
comfyui c6d96d9
remove tos: f7ddc09