pnpm format
hunterheston committed Nov 11, 2024
1 parent bcb4e4a commit 1ae7e98
Showing 1 changed file with 14 additions and 9 deletions.
23 changes: 14 additions & 9 deletions js/plugins/checks/README.md
# Checks

Checks is an AI safety platform built by Google: [checks.google.com/ai-safety](https://checks.google.com/ai-safety).

This plugin provides evaluators for each Checks AI safety policy. Text is classified by calling the [Checks Guardrails API](https://console.cloud.google.com/marketplace/product/google/checks.googleapis.com).

> Note: The Guardrails API is currently in private preview and you will need to request quota. See the Guardrails documentation: [developers.devsite.corp.google.com/checks/guide/api/ai-safety](https://developers.devsite.corp.google.com/checks/guide/api/ai-safety?db=sherzat)

Currently the list of policies includes:
```text
DANGEROUS_CONTENT
PII_SOLICITING_RECITING
HARASSMENT
SEXUALLY_EXPLICIT
HATE_SPEECH
MEDICAL_INFO
VIOLENCE_AND_GORE
OBSCENITY_AND_PROFANITY
```

## How to use

### Configure the plugin

Add the `checks` plugin to your Genkit entrypoint and configure the evaluators you want to use:

```ts
import { checks, ChecksEvaluationMetricType } from '@genkit-ai/checks';

export const ai = genkit({
  plugins: [
    checks({
      // Project to charge quota to.
      // Note: if your credentials have a quota project associated with them,
      // that value will take precedence over this one.
      projectId: 'your-project-id',
      // ...
    }),
  ],
});
```
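For reference, a complete configuration might look like the sketch below. The `evaluation.metrics` shape (plain enum values, or objects with a `threshold`) is an assumption based on the imported `ChecksEvaluationMetricType` enum and the evaluator names used later in this README; check the plugin's documentation for the exact API.

```typescript
import { genkit } from 'genkit';
import { checks, ChecksEvaluationMetricType } from '@genkit-ai/checks';

export const ai = genkit({
  plugins: [
    checks({
      // Project to charge quota to (same field as in the snippet above).
      projectId: 'your-project-id',
      evaluation: {
        metrics: [
          // Use the default threshold for this policy.
          ChecksEvaluationMetricType.DANGEROUS_CONTENT,
          // Assumed shape for overriding the classification threshold per policy.
          {
            type: ChecksEvaluationMetricType.VIOLENCE_AND_GORE,
            threshold: 0.5,
          },
        ],
      },
    }),
  ],
});
```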

### Create a test dataset

Create a JSON file with the data you want to test. Add as many test cases as you want. `output` is the text that will be classified.

```json
// test-dataset.json
[
  ...
]
```
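As a concrete illustration, a complete dataset file might look like this sketch. Only the `output` field is documented above; the `testCaseId` and `input` fields here are assumptions and may not match exactly what `genkit eval:run` expects:

```json
[
  {
    "testCaseId": "dangerous_content_example",
    "input": "Hypothetical prompt sent to the model.",
    "output": "Hypothetical model response that will be classified."
  }
]
```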

### Run the evaluators

```bash
# Run just the DANGEROUS_CONTENT classifier.
genkit eval:run test-dataset.json --evaluators=checks/dangerous_content
```

```bash
# Run all classifiers.
genkit eval:run test-dataset.json --evaluators=checks/dangerous_content,checks/pii_soliciting_reciting,checks/harassment,checks/sexually_explicit,checks/hate_speech,checks/medical_info,checks/violence_and_gore,checks/obscenity_and_profanity
```

### View the results

Run `genkit start` and open the Genkit UI (usually at `localhost:4000`), then select the Evaluate tab.

# Genkit

