Releases: getsavvyinc/savvy-cli
0.21.1
0.21.0
Changelog
- 6b8cd27 README: reorder README and add savvy sync
- 5fcb5c2 client/llm: rm file
- eee25ba client: add llmClient that abstracts away the llm used to generate workflows
- 47cc768 client: refactor types to authz and llm pkgs
- f5e3bd0 config,llm: validate llm_base_url and inherit model name from config
- 9a695f3 config: support openai_base_url config value
- 43d27c3 custom_llm: impl Ask
- a4f4c10 enforce json output with structured outputs
- 113ccfc export: Artifact -> Workflow
- f471c6f llm,model,client: separate types for reuse
- a9decda llm: impl custom openAI endpoint
- 9b4f0ed refactor: move Ask api calls into llm pkg.
- 335ab9c support savvy explain with local llms
- efd0c37 tidy go.mod
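The llm and config commits above (llmClient abstraction, the openai_base_url config value, and llm_base_url validation) together describe pointing savvy at a custom OpenAI-compatible endpoint. Below is a hypothetical sketch of what that abstraction could look like; the names (LLMClient, customLLM, Config, New) are illustrative assumptions, not the actual savvy-cli API.

```go
// Package llm: illustrative sketch of an LLM client abstraction that hides
// which backend generates workflows. Not the real savvy-cli implementation.
package llm

import (
	"context"
	"fmt"
	"net/url"
)

// Config mirrors the config values mentioned in the changelog:
// an OpenAI-compatible base URL and a model name inherited from config.
type Config struct {
	BaseURL string // e.g. the openai_base_url config value
	Model   string
}

// LLMClient abstracts away the LLM used to generate workflows.
type LLMClient interface {
	Ask(ctx context.Context, question string) (string, error)
}

// customLLM talks to any OpenAI-compatible endpoint at baseURL.
type customLLM struct {
	baseURL *url.URL
	model   string
}

// New validates the configured base URL before constructing a client,
// in the spirit of the "validate llm_base_url" commit.
func New(cfg Config) (LLMClient, error) {
	u, err := url.Parse(cfg.BaseURL)
	if err != nil || u.Scheme == "" || u.Host == "" {
		return nil, fmt.Errorf("invalid llm base url %q: %v", cfg.BaseURL, err)
	}
	return &customLLM{baseURL: u, model: cfg.Model}, nil
}

func (c *customLLM) Ask(ctx context.Context, question string) (string, error) {
	// A real client would POST a chat-completion request to c.baseURL,
	// enforcing JSON output via structured outputs as noted above.
	return "", fmt.Errorf("Ask is not implemented in this sketch")
}
```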
0.20.1
0.20.0
Changelog
- efb70e5 add local flag to savvy run
- fe6be90 client: Add a local client.
- b58c262 client: Runbooks add an opt to fetch all runbooks or just runbooks owned by a user
- a2f403d cmd/run: use smaller client.RunbookClient instead of the full client.
- 3121233 storage: Write/Read runbooks to a local file
- 61a95e2 storage: serialize using gob instead of json
- 489f4a9 storage: store local copy at ~/.savvy
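The storage commits above describe writing runbooks to a local copy under ~/.savvy, gob-encoded rather than JSON. The sketch below shows one plausible shape of that using only the Go standard library; the Runbook type and function names are hypothetical stand-ins, not the actual savvy-cli code.

```go
// Package storage: illustrative sketch of gob-based local runbook storage.
package storage

import (
	"encoding/gob"
	"os"
	"path/filepath"
)

// Runbook is a stand-in for the real runbook type.
type Runbook struct {
	Title    string
	Commands []string
}

// storePath resolves the local store file under ~/.savvy, creating the
// directory if needed.
func storePath() (string, error) {
	home, err := os.UserHomeDir()
	if err != nil {
		return "", err
	}
	dir := filepath.Join(home, ".savvy")
	if err := os.MkdirAll(dir, 0o755); err != nil {
		return "", err
	}
	return filepath.Join(dir, "runbooks.gob"), nil
}

// Write serializes runbooks with gob (a compact, Go-native encoding)
// instead of JSON.
func Write(runbooks []Runbook) error {
	path, err := storePath()
	if err != nil {
		return err
	}
	f, err := os.Create(path)
	if err != nil {
		return err
	}
	defer f.Close()
	return gob.NewEncoder(f).Encode(runbooks)
}

// Read loads the locally stored runbooks back from ~/.savvy.
func Read() ([]Runbook, error) {
	path, err := storePath()
	if err != nil {
		return nil, err
	}
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()
	var runbooks []Runbook
	err = gob.NewDecoder(f).Decode(&runbooks)
	return runbooks, err
}
```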