An example JavaScript app demonstrating how to integrate Pangea's AI Guard service into a LangChain application to monitor and sanitize LLM generations.
- Node.js v22.
- A Pangea account with AI Guard enabled.
- An OpenAI API key.
```shell
git clone https://github.com/pangeacyber/langchain-js-aig-response-tracing.git
cd langchain-js-aig-response-tracing
npm install
cp .env.example .env
```
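The `.env` file holds the credentials the app needs. The exact variable names come from `.env.example`; the names and values below are placeholders for illustration only:

```shell
# Placeholder names and values — copy the real variable names from
# .env.example and the real values from your Pangea and OpenAI dashboards.
PANGEA_AI_GUARD_TOKEN=pts_...
OPENAI_API_KEY=sk-...
```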
Fill in the values in `.env`, then run the app like so:

```shell
npm run demo -- "A prompt would go here."
```
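Internally, the pattern is to chain a guard step after the model's generation step, so only sanitized text reaches the user. A minimal, framework-agnostic sketch of that shape — `callModel` and `guardText` below are hypothetical stand-ins for the LangChain model call and the Pangea AI Guard API call, which the repo wires up with the real SDKs:

```javascript
// Hypothetical stand-in for the LangChain chat model invocation.
async function callModel(prompt) {
  return `You said: ${prompt}`; // placeholder "LLM" that echoes the prompt
}

// Hypothetical stand-in for the AI Guard API call; the real service
// inspects the text server-side and returns a sanitized copy.
async function guardText(text) {
  return text;
}

// The generation is piped through the guard before being returned.
async function generateGuarded(prompt) {
  const raw = await callModel(prompt);
  return guardText(raw);
}

generateGuarded("hello").then((out) => console.log(out));
```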
For example, AI Guard protects against leaking credentials such as Pangea API tokens. The easiest way to demonstrate this is to have the LLM echo back a given (fake) API token:
```shell
npm run demo -- "Echo 'pts_testtesttesttesttesttesttesttest' back."
```
The output after AI Guard is:

```
************************************
```
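AI Guard masks each detected secret with asterisks of the same length as the match. A rough local illustration of that behavior — the regex for the Pangea token format is an assumption made for this sketch; the real service uses its own server-side detectors:

```javascript
// Assumed pattern for Pangea API tokens ("pts_" prefix + 32 characters);
// for illustration only, not the service's actual detector.
const PANGEA_TOKEN_RE = /pts_[a-zA-Z0-9_]{32}/g;

function redactTokens(text) {
  // Replace each match with an equal-length run of asterisks,
  // mirroring the redacted output shown above.
  return text.replace(PANGEA_TOKEN_RE, (m) => "*".repeat(m.length));
}

console.log(redactTokens("Echo 'pts_testtesttesttesttesttesttesttest' back."));
```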