llm-claude-3 is now called llm-anthropic
Refs simonw/llm-claude-3#31

!stable-docs
simonw committed Feb 2, 2025
1 parent deb8bc3 commit 21df241
Showing 3 changed files with 4 additions and 5 deletions.
docs/plugins/directory.md (3 changes: 1 addition & 2 deletions)

@@ -21,8 +21,7 @@ These plugins can be used to interact with remotely hosted models via their API:

- **[llm-mistral](https://github.com/simonw/llm-mistral)** adds support for [Mistral AI](https://mistral.ai/)'s language and embedding models.
- **[llm-gemini](https://github.com/simonw/llm-gemini)** adds support for Google's [Gemini](https://ai.google.dev/docs) models.
-- **[llm-claude](https://github.com/tomviner/llm-claude)** by Tom Viner adds support for Claude 2.1 and Claude Instant 2.1 by Anthropic.
-- **[llm-claude-3](https://github.com/simonw/llm-claude-3)** supports Anthropic's [Claude 3 family](https://www.anthropic.com/news/claude-3-family) of models.
+- **[llm-anthropic](https://github.com/simonw/llm-anthropic)** supports Anthropic's [Claude 3 family](https://www.anthropic.com/news/claude-3-family), [3.5 Sonnet](https://www.anthropic.com/news/claude-3-5-sonnet) and beyond.
- **[llm-command-r](https://github.com/simonw/llm-command-r)** supports Cohere's Command R and [Command R Plus](https://txt.cohere.com/command-r-plus-microsoft-azure/) API models.
- **[llm-reka](https://github.com/simonw/llm-reka)** supports the [Reka](https://www.reka.ai/) family of models via their API.
- **[llm-perplexity](https://github.com/hex/llm-perplexity)** by Alexandru Geana supports the [Perplexity Labs](https://docs.perplexity.ai/) API models, including `llama-3-sonar-large-32k-online` which can search for things online and `llama-3-70b-instruct`.
docs/python-api.md (4 changes: 2 additions & 2 deletions)

@@ -94,10 +94,10 @@ print(model.prompt("Names for otters", temperature=0.2))

### Models from plugins

-Any models you have installed as plugins will also be available through this mechanism, for example to use Anthropic's Claude 3.5 Sonnet model with [llm-claude-3](https://github.com/simonw/llm-claude-3):
+Any models you have installed as plugins will also be available through this mechanism, for example to use Anthropic's Claude 3.5 Sonnet model with [llm-anthropic](https://github.com/simonw/llm-anthropic):

```bash
-pip install llm-claude-3
+pip install llm-anthropic
```
Then in your Python code:
```python
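# A minimal sketch of the usage this section describes, assuming the standard
# llm Python API and the "claude-3.5-sonnet" alias registered by llm-anthropic;
# the original example is truncated in this diff view.
import llm

model = llm.get_model("claude-3.5-sonnet")
response = model.prompt("Fun facts about otters")
print(response.text())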
```
docs/setup.md (2 changes: 1 addition & 1 deletion)

@@ -56,7 +56,7 @@ This will install and run LLM using a temporary virtual environment.
You can use the `--with` option to add extra plugins. To use Anthropic's models, for example:
```bash
export ANTHROPIC_API_KEY='...'
-uvx --with llm-claude-3 llm -m claude-3.5-haiku 'fun facts about skunks'
+uvx --with llm-anthropic llm -m claude-3.5-haiku 'fun facts about skunks'
```
All of the usual LLM commands will work with `uvx llm`. Here's how to set your OpenAI key without needing an environment variable, for example:
```bash
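# A sketch of the command this sentence refers to, assuming LLM's standard
# `llm keys set` mechanism; the original example is truncated in this diff view.
uvx llm keys set openai
# Paste your OpenAI API key when prompted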
```
