AI Guard Response Tracing for LangChain in JavaScript

An example JavaScript app demonstrating how to integrate Pangea's AI Guard service into a LangChain app to monitor and sanitize LLM generations.
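The overall pattern is: generate a completion first, then pass the generation through AI Guard before returning it to the caller. A minimal sketch of that pattern in plain JavaScript, with invented stand-in functions (the real app wires this through LangChain and Pangea's AI Guard API, not the hypothetical `fakeLlm`/`guardOutput` shown here):

```javascript
// Conceptual sketch only: the function names and the redaction rule below are
// invented for illustration; the actual app routes LLM generations through
// LangChain and Pangea's AI Guard service.

// Stand-in for the LLM call (the real app invokes a chat model via LangChain).
function fakeLlm(prompt) {
  return `Sure: ${prompt}`;
}

// Stand-in for AI Guard: scan the generation and mask anything that looks
// like a Pangea API token (`pts_` followed by 32 token characters).
function guardOutput(text) {
  return text.replace(/pts_[A-Za-z0-9_]{32}/g, (m) => "*".repeat(m.length));
}

// The integration pattern: generate first, then sanitize before returning.
function demo(prompt) {
  return guardOutput(fakeLlm(prompt));
}

console.log(demo("Echo 'pts_testtesttesttesttesttesttesttest' back."));
```

The point of the pattern is that the guard sits after generation, so the raw model output is never surfaced to the user unsanitized.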

Prerequisites

- Node.js and npm
- A Pangea account with access to the AI Guard service (for the API token that goes in `.env`)
- An API key for the LLM provider the demo calls (e.g., OpenAI)

Setup

```shell
git clone https://github.com/pangeacyber/langchain-js-aig-response-tracing.git
cd langchain-js-aig-response-tracing
npm install
cp .env.example .env
```

Fill in the values in `.env`, then run the app like so:

```shell
npm run demo -- "A prompt would go here."
```

For example, AI Guard protects against leaking credentials such as Pangea API tokens. The easiest way to demonstrate this is to have the LLM repeat a given (fake) API token:

```shell
npm run demo -- "Echo 'pts_testtesttesttesttesttesttesttest' back."
```

The output after AI Guard is:

```
************************************
```
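Note that the mask has the same length as the fake token: `pts_` (4 characters) plus eight repetitions of `test` (32 characters) is 36 characters, matching the 36 asterisks above. This suggests the detected secret is replaced with a same-length mask. A quick sanity check of that arithmetic:

```javascript
// The fake token from the demo command above.
const token = "pts_testtesttesttesttesttesttesttest";

// "pts_" (4 chars) + 8 * "test" (32 chars) = 36 characters total.
console.log(token.length); // 36

// A same-length mask reproduces the redacted demo output.
console.log("*".repeat(token.length));
```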
