
Commit

feat: Auto-download Ollama models (#98)
kaspermarstal authored Jan 30, 2025

Verified: This commit was signed with the committer's verified signature (the key has expired). Key owner: Foivos Zakkak (zakkak).
1 parent 1c5c4d5 commit c5c4b12
Showing 2 changed files with 30 additions and 18 deletions.
25 changes: 9 additions & 16 deletions README.md
@@ -30,7 +30,7 @@ In this example, we copy the papers' titles and abstracts into Excel and write t
We then use autofill to apply the prompt to many papers. Simple and powerful.

-Note that a paper is misclassified. The models _will_ make mistakes. It is your responsibility to cross validate if a model is accurate enough for your use case and upgrade model or use another approach if not.
+Green denotes a correct classification and red denotes an incorrect classification. The models _will_ make mistakes at times, and it is your responsibility to cross-validate whether a model is accurate enough for your use case and to upgrade the model or use another approach if not.

## Getting Started

@@ -64,14 +64,9 @@ Cellm must be built from source and installed via Excel. Follow the steps below.
dotnet build --configuration Release
```

-5. Cellm uses Ollama and the Gemma 2 2B model by default. Download and install [Ollama](https://ollama.com/) and run the following command in your Windows terminal to download the model:
-   ```cmd
-   ollama pull gemma2:2b
-   ```
-   To use other models, see the [Models](#models) section below.
+5. Download and install [Ollama](https://ollama.com/). Cellm uses Ollama and the Gemma 2 2B model by default. Gemma 2 2B will be downloaded automatically the first time you call `=PROMPT()`. To call other models, see the [Models](#models) section below.

-These steps will build Cellm on your computer. Continue with the steps below to install Cellm in Excel.
+The above steps will build Cellm on your computer. Continue with the steps below to install Cellm in Excel.

### Install

@@ -92,25 +87,23 @@ In this example we use openai/gpt-4o-mini to list PDF files in a folder.
=PROMPT(A1, "Which pdf files do I have in my downloads folder?")
```

-To configure which AI model processes your prompts, use the Cellm tab in Excel's ribbon menu:
+To configure which AI model you call, use the Cellm tab in Excel's ribbon menu:

- **Model**: Select which AI model to use (e.g., "openai/gpt-4o-mini")
- **Address**: The API endpoint for your chosen provider (e.g., "https://api.openai.com/v1")
- **API Key**: Your authentication key for the selected provider

-Additional tools in the Cellm tab:
+The other options in the Cellm tab are:
- **Cache**: Enable/disable local caching of model responses. Useful when Excel triggers recalculation of many cells.
-- **Functions**: Enable/disable tools. See the Functions section below for detailed syntax and examples.
+- **Functions**: Enable/disable tools (not to be confused with Excel _formula_ functions below).

### Functions
Cellm provides the following functions that can be used in Excel formulas:

#### PROMPT

```excel
-PROMPT(cells: range, [instruction: range | instruction: string | temperature: double], [temperature: double]): string
+PROMPT(cells: range | instruction: string, [instruction: range | instruction: string | temperature: double], [temperature: double]): string
```

- **cells (Required):** A cell or a range of cells.
@@ -133,7 +126,7 @@ Example usage:
#### PROMPTWITH

```excel
-PROMPTWITH(providerAndModel: string or cell, cells: range, [instruction: range | instruction: string | temperature: double], [temperature: double]): string
+PROMPTWITH(providerAndModel: string or cell, cells: range | instruction: string, [instruction: range | instruction: string | temperature: double], [temperature: double]): string
```

Allows you to specify the model as the first argument.
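For instance, to classify abstracts with a specific hosted model (the model name reuses the openai/gpt-4o-mini example earlier in this README; the instruction string is illustrative):

```excel
=PROMPTWITH("openai/gpt-4o-mini", A1:A10, "Classify each abstract as Relevant or Irrelevant")
```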
@@ -198,7 +191,7 @@ The following sections shows you how to configure `appsettings.Local.json` for c
### Hosted LLMs
-Cellm supports hosted models from Anthropic, Google, OpenAI, and any OpenAI-compatible provider. To use e.g. Claude 3.5 Sonnet from Anthropic:
+Cellm supports hosted models from Anthropic, DeepSeek, Google, OpenAI, Mistral, and any OpenAI-compatible provider. To use e.g. Claude 3.5 Sonnet from Anthropic:
1. Rename `src/Cellm/appsettings.Anthropic.json` to `src/Cellm/appsettings.Local.json`.
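For the default local setup, the handler reads the Ollama endpoint from `OllamaConfiguration` (see the `BaseAddress` usage in the code change below). A minimal `appsettings.Local.json` sketch; the section layout is inferred rather than copied from the repo, and the port is Ollama's standard default:

```json
{
  "OllamaConfiguration": {
    "BaseAddress": "http://localhost:11434"
  }
}
```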
23 changes: 21 additions & 2 deletions src/Cellm/Models/Providers/Ollama/OllamaRequestHandler.cs
@@ -1,14 +1,33 @@
-using Cellm.Models.Prompts;
+using System.Text;
+using System.Text.Json;
+using Cellm.Models.Prompts;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;

namespace Cellm.Models.Providers.Ollama;

internal class OllamaRequestHandler(
-    [FromKeyedServices(Provider.Ollama)] IChatClient chatClient) : IModelRequestHandler<OllamaRequest, OllamaResponse>
+    IOptionsMonitor<OllamaConfiguration> ollamaConfiguration,
+    [FromKeyedServices(Provider.Ollama)] IChatClient chatClient,
+    HttpClient httpClient) : IModelRequestHandler<OllamaRequest, OllamaResponse>
{
    public async Task<OllamaResponse> Handle(OllamaRequest request, CancellationToken cancellationToken)
    {
+        // Pull model if it doesn't exist
+        var json = await httpClient.GetStringAsync(new Uri(ollamaConfiguration.CurrentValue.BaseAddress, "api/tags"), cancellationToken);
+
+        if (!JsonDocument.Parse(json).RootElement
+            .GetProperty("models")
+            .EnumerateArray()
+            .Select(model => model.GetProperty("name").GetString())
+            .Contains(request.Prompt.Options.ModelId))
+        {
+            var body = new StringContent($"{{\"model\": \"{request.Prompt.Options.ModelId}\", \"stream\": false}}", Encoding.UTF8, "application/json");
+            var response = await httpClient.PostAsync(new Uri(ollamaConfiguration.CurrentValue.BaseAddress, "api/pull"), body, cancellationToken);
+            response.EnsureSuccessStatusCode();
+        }

        var chatCompletion = await chatClient.CompleteAsync(
            request.Prompt.Messages,
            request.Prompt.Options,
            cancellationToken);
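The gist of the new handler: fetch the list of installed models from `/api/tags`, and only if the requested model id is missing, POST to `/api/pull`. A sketch of the membership test in Python; the sample payload is made up for illustration, and the real handler does this with `HttpClient` as shown above:

```python
import json

def model_is_installed(tags_json: str, model_id: str) -> bool:
    """True if model_id appears under "models" in an Ollama /api/tags response."""
    models = json.loads(tags_json)["models"]
    return any(m["name"] == model_id for m in models)

# Illustrative /api/tags payload; a real server lists locally pulled models.
tags = json.dumps({"models": [{"name": "gemma2:2b"}, {"name": "llama3:8b"}]})

print(model_is_installed(tags, "gemma2:2b"))   # True: no pull needed
print(model_is_installed(tags, "mistral:7b"))  # False: handler would POST to /api/pull
```

Note the exact-match comparison: the model id configured in Cellm must match the tag name Ollama reports, or the handler will re-issue a pull.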
