Commit 2e0aac7: Adjusted readme

mattpocock committed Dec 3, 2024 (parent: 95d141f)
Showing 3 changed files with 34 additions and 0 deletions.
6 changes: 6 additions & 0 deletions .husky/pre-commit
```diff
@@ -1,2 +1,8 @@
+FILE="readme.md"
+if git diff --cached --name-only | grep -Fx "$FILE" > /dev/null; then
+  echo "Error: $FILE has been modified. Please move your changes to packages/evalite/readme.md instead."
+  exit 1
+fi
+
 cp packages/evalite/readme.md readme.md
 git add readme.md
```
14 changes: 14 additions & 0 deletions packages/evalite/readme.md
@@ -82,6 +82,20 @@ Open http://localhost:3006 in your browser to view the results of the eval.

## Guides

### Watch Mode

You can run Evalite in watch mode with `evalite watch`:

```bash
evalite watch
```

This will watch for changes to your `.eval.ts` files and re-run the evals when they change.

> [!IMPORTANT]
>
> I strongly recommend implementing a caching layer in your LLM calls. This will keep your tests running fast and avoid burning through your API credits.
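
One minimal approach to such a caching layer is an in-memory map keyed by the prompt. This is a sketch, not part of Evalite itself; `callModel` and `cachedCallModel` are hypothetical names standing in for your own LLM wrapper, and a real setup might persist the cache to disk so it survives between runs:

```typescript
// A minimal in-memory cache keyed by prompt (a sketch; `callModel`
// is a hypothetical stand-in for your real LLM request).
const cache = new Map<string, Promise<string>>();

const callModel = async (prompt: string): Promise<string> => {
  // Imagine a real network call to your LLM provider here.
  return `response for: ${prompt}`;
};

const cachedCallModel = (prompt: string): Promise<string> => {
  // Reuse the in-flight or completed result for identical prompts,
  // so re-runs in watch mode don't re-hit the API.
  const existing = cache.get(prompt);
  if (existing) return existing;

  const result = callModel(prompt);
  cache.set(prompt, result);
  return result;
};
```

Because the cache stores the promise rather than the resolved value, concurrent evals with the same prompt share a single request.
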

### Environment Variables

To call your LLM from a third-party service, you'll likely need some environment variables to keep your API keys safe.
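
For example (a sketch assuming a Node.js runtime; `OPENAI_API_KEY` is a hypothetical variable name for illustration, not one Evalite requires):

```typescript
// Read the key from the environment instead of hard-coding it.
// OPENAI_API_KEY is a hypothetical variable name for illustration.
const getApiKey = (): string => {
  const apiKey = process.env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error("Missing OPENAI_API_KEY environment variable");
  }
  return apiKey;
};
```

Failing fast with a clear error when the variable is missing makes misconfigured CI runs easier to diagnose.
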
14 changes: 14 additions & 0 deletions readme.md
@@ -82,6 +82,20 @@ Open http://localhost:3006 in your browser to view the results of the eval.

## Guides

### Watch Mode

You can run Evalite in watch mode with `evalite watch`:

```bash
evalite watch
```

This will watch for changes to your `.eval.ts` files and re-run the evals when they change.

> [!IMPORTANT]
>
> I strongly recommend implementing a caching layer in your LLM calls. This will keep your tests running fast and avoid burning through your API credits.

### Environment Variables

To call your LLM from a third-party service, you'll likely need some environment variables to keep your API keys safe.
