This repository contains a step-by-step guide for setting up tracing for a chatbot using Arize Phoenix, an open-source LLM observability solution that you can self-host in your own environment and use to auto-instrument traces. The concepts in this repository apply to any situation where you want to set up LLM observability. However, note that the configuration used for resources in this guide, such as Elastic Load Balancing (ELB) and Amazon Elastic Container Registry (Amazon ECR), is not suitable for production use as-is. Conduct a thorough security review before taking these concepts to a production environment.
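Auto-instrumentation means an instrumentor wraps your LLM client calls and records a span for each one, with no changes to your application logic. The sketch below illustrates the idea in plain Python; it is not Phoenix's actual API (names like `traced` and `SPANS` are invented for this illustration), and real instrumentors export OpenTelemetry spans to a collector such as Phoenix rather than storing them in a list.

```python
import functools
import time

# In-memory "span" store standing in for a real span exporter.
SPANS = []

def traced(fn):
    """Wrap a function and record a span (name, duration, status) per call."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        status = "OK"
        try:
            return fn(*args, **kwargs)
        except Exception:
            status = "ERROR"
            raise
        finally:
            SPANS.append({
                "name": fn.__name__,
                "duration_s": time.monotonic() - start,
                "status": status,
            })
    return wrapper

@traced
def chat_completion(prompt: str) -> str:
    # Stand-in for a real LLM client call.
    return f"echo: {prompt}"

print(chat_completion("hello"))
print(SPANS[0]["name"], SPANS[0]["status"])
```

In the real setup, the instrumentation happens at the client-library level and the resulting traces are sent to the Phoenix collector endpoint, where you can inspect prompts, responses, latencies, and errors in the Phoenix UI.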
Clone the git repository into a folder. For example:
git clone https://github.com/seanlee10/llm-observability-with-arize-phoenix
cd llm-observability-with-arize-phoenix/gradio

Build the Docker image for the Gradio chatbot:

docker build -t phoenix-demo-gradio .
Next, move to the infra folder and deploy the AWS infrastructure with the AWS CDK:

cd ../infra
cdk deploy

When you are finished, tear down the provisioned resources to avoid ongoing charges:

cdk destroy
This library is licensed under the MIT-0 License. See the LICENSE file.