
Ollama local LLM step-by-step guide? #182

Answered by TerrenceMiao
robscurity asked this question in Q&A

This is my workaround: copy the tagged model to the default latest tag, so that Fabric can refer to it by its bare name:

$ ollama cp mixtral:instruct mixtral
copied 'mixtral:instruct' to 'mixtral'
$ ollama list
NAME                ID            SIZE   MODIFIED
mixtral:instruct    7708c059a8bb  26 GB  2 weeks ago
mixtral:latest      7708c059a8bb  26 GB  5 seconds ago
$ echo "An idea that coding is like speaking with rules." | fabric --pattern write_essay --model mixtral

Fabric automatically detects Ollama when it is running on the default port 11434, so there is no need to set OPENAI_BASE_URL in the configuration file ~/.config/fabric/.env.
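If your Ollama instance listens somewhere other than localhost:11434, here is a minimal sketch of what ~/.config/fabric/.env might contain. The host is a placeholder, and the exact URL shape (including the /v1 suffix for Ollama's OpenAI-compatible endpoint) is an assumption that may vary by Fabric version:

# Assumption: point Fabric at Ollama's OpenAI-compatible endpoint;
# adjust host, port, and the /v1 suffix to match your setup.
OPENAI_BASE_URL=http://your-ollama-host:11434/v1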

Answer selected by robscurity