
Merge branch 'main' into dev-1614
MarcoPonchia authored Dec 19, 2024
2 parents 3808be8 + 2345af9 commit f3dd9a4
Showing 57 changed files with 1,915 additions and 1,463 deletions.
5 changes: 5 additions & 0 deletions .changeset/gorgeous-ghosts-float.md
@@ -0,0 +1,5 @@
---
"infrastructure": patch
---

Use the ddb stream event id as message group id in sqs
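A minimal sketch of what this changeset describes — a handler forwarding DynamoDB stream records to an SQS FIFO queue, using each record's event id as the message group id. The queue URL and function names here are hypothetical, not taken from the repository:

```python
import json

# Hypothetical queue URL; the real value comes from the infrastructure config.
QUEUE_URL = "https://sqs.eu-south-1.amazonaws.com/000000000000/chatbot-queue.fifo"


def group_id_for(record: dict) -> str:
    # Each DynamoDB stream record carries a unique eventID; using it as the
    # SQS MessageGroupId is the change this patch describes.
    return record["eventID"]


def handler(event, context, sqs_client=None):
    if sqs_client is None:
        import boto3  # deferred: only needed in the real Lambda runtime
        sqs_client = boto3.client("sqs")
    for record in event.get("Records", []):
        sqs_client.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps(record),
            MessageGroupId=group_id_for(record),
        )
```

Because the group id is unique per stream event, records land in distinct FIFO message groups rather than serializing behind a single one.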
6 changes: 6 additions & 0 deletions .changeset/nice-coins-mix.md
@@ -0,0 +1,6 @@
---
"infrastructure": minor
"chatbot": minor
---

Implemented llm monitoring with LangFuse
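A sketch of how Langfuse monitoring is commonly wired into a llama-index application. The environment variable names mirror the `.env.example` in this commit, but the handler factory and the exact integration point in the chatbot are assumptions:

```python
import os


def langfuse_host(env) -> str:
    # Falls back to the local Langfuse instance started by docker compose.
    return env.get("CHB_LANGFUSE_HOST", "http://localhost:3000")


def make_langfuse_handler(env=os.environ):
    # Assumes the `langfuse` package (v2) and its llama-index callback
    # integration; in this repo the keys are SSM paths per .env.example.
    from langfuse.llama_index import LlamaIndexCallbackHandler

    return LlamaIndexCallbackHandler(
        public_key=env["CHB_LANGFUSE_PUBLIC_KEY"],
        secret_key=env["CHB_LANGFUSE_SECRET_KEY"],
        host=langfuse_host(env),
    )
```

The handler would then be registered on llama-index's callback manager so that every LLM call and retrieval step is traced in Langfuse.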
2 changes: 1 addition & 1 deletion .gitmodules
@@ -1,3 +1,3 @@
[submodule "apps/nextjs-website/.tmp-docs"]
path = apps/nextjs-website/.tmp-docs
url = https://github.com/pagopa/devportal-docs.git
url = https://github.com/pagopa/devportal-docs.git
1 change: 1 addition & 0 deletions apps/chatbot/.dockerignore
@@ -2,3 +2,4 @@
build-devp
.pytest_cache
load-test
*.ipynb
75 changes: 55 additions & 20 deletions apps/chatbot/.env.example
@@ -1,31 +1,66 @@
environment=local
CORS_DOMAINS=["*"]
PYTHONPATH=app-path
LOG_LEVEL=DEBUG
AUTH_COGNITO_ALLOW_ACCOUNT_LINKING=true
AUTH_COGNITO_CLIENT_ID=...
AUTH_COGNITO_CLIENT_SECRET=
AUTH_COGNITO_ISSUER=https://cognito-idp.eu-south-1.amazonaws.com/eu-south-1_xxxxxxxx
AUTH_DISABLE_SIGNUP=false
AUTH_DISABLE_USERNAME_PASSWORD=true
CHB_AWS_ACCESS_KEY_ID=...
CHB_AWS_SECRET_ACCESS_KEY=...
CHB_AWS_BEDROCK_EMBED_REGION=eu-central-1
CHB_AWS_BEDROCK_LLM_REGION=eu-west-3
CHB_AWS_DEFAULT_REGION=eu-south-1
CHB_AWS_BEDROCK_REGION=eu-west-3
CHB_AWS_GUARDRAIL_ID=...
CHB_AWS_GUARDRAIL_VERSION=...
CHB_REDIS_URL=...
CHB_WEBSITE_URL=...
CHB_REDIS_INDEX_NAME=...
CHB_LLAMAINDEX_INDEX_ID=...
CHB_AWS_SECRET_ACCESS_KEY=...
CHB_DOCUMENTATION_DIR=...
CHB_USE_PRESIDIO=...
CHB_GOOGLE_API_KEY=...
CHB_PROVIDER=...
CHB_MODEL_ID=...
CHB_MODEL_TEMPERATURE=...
CHB_MODEL_MAXTOKENS=...
CHB_DYNAMODB_URL=http://localhost:8080
CHB_EMBED_MODEL_ID=...
CHB_ENGINE_SIMILARITY_TOPK=...
CHB_ENGINE_SIMILARITY_CUTOFF=...
CHB_ENGINE_SIMILARITY_TOPK=...
CHB_ENGINE_USE_ASYNC=True
CHB_ENGINE_USE_STREAMING=...
CHB_ENGINE_USE_CHAT_ENGINE=...
CHB_GOOGLE_API_KEY=...
CHB_LANGFUSE_HOST=http://localhost:3000
CHB_LANGFUSE_PUBLIC_KEY=/nonexistent/ssmpath
CHB_LANGFUSE_SECRET_KEY=/nonexistent/ssmpath
CHB_LLAMAINDEX_INDEX_ID=...
CHB_MODEL_ID=...
CHB_MODEL_MAXTOKENS=...
CHB_MODEL_TEMPERATURE=...
CHB_PROVIDER=...
CHB_QUERY_TABLE_PREFIX=chatbot-local
CHB_DYNAMODB_URL=http://localhost:8080
CHB_USE_PRESIDIO=True
CHB_REDIS_INDEX_NAME=...
CHB_REDIS_URL=...
CHB_SESSION_MAX_DURATION_DAYS=1
CHB_USE_CHAT_ENGINE=True
CHB_USE_PRESIDIO=True
CHB_WEBSITE_URL=...
CORS_DOMAINS=["*"]
ENCRYPTION_KEY=0000000000000000000000000000000000000000000000000000000000000000
environment=local
LAMBDA_TASK_ROOT=app-dir
LANGFUSE_ENABLE_EXPERIMENTAL_FEATURES=false
LANGFUSE_INIT_ORG_ID=...
LANGFUSE_INIT_ORG_NAME=...
LANGFUSE_INIT_PROJECT_ID=monitor-123
LANGFUSE_INIT_PROJECT_NAME=Monitor
LANGFUSE_INIT_PROJECT_PUBLIC_KEY=pk-xxx
LANGFUSE_INIT_PROJECT_SECRET_KEY=sk-xxx
LANGFUSE_INIT_USER_EMAIL=...
LANGFUSE_INIT_USER_NAME=...
LANGFUSE_INIT_USER_PASSWORD=...
LANGFUSE_TAG=development
LANGFUSE_USER_EMAIL=[email protected]
LANGFUSE_USER_NAME=User
LANGFUSE_USER_PASSWORD=...
LOG_LEVEL=DEBUG
NEXTAUTH_SECRET=mysecret
NEXTAUTH_URL=http://localhost:3001
POSTGRES_DB=postgres
POSTGRES_HOST=localhost
POSTGRES_PASSWORD=postgres
POSTGRES_PORT=5432
POSTGRES_USER=postgres
DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
PYTHONPATH=app-path
SALT=mysalt
TELEMETRY_ENABLED=true
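Note that the `DATABASE_URL` line relies on `${VAR}` interpolation, which docker compose performs but a plain dotenv loader may not. A small helper sketch that rebuilds the URL from the individual `POSTGRES_*` values (variable names taken from the file above):

```python
def database_url(env) -> str:
    # Mirrors the DATABASE_URL template in .env.example.
    keys = ("POSTGRES_USER", "POSTGRES_PASSWORD",
            "POSTGRES_HOST", "POSTGRES_PORT", "POSTGRES_DB")
    vals = {k: env[k] for k in keys}
    return ("postgresql://{POSTGRES_USER}:{POSTGRES_PASSWORD}"
            "@{POSTGRES_HOST}:{POSTGRES_PORT}/{POSTGRES_DB}").format(**vals)
```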
44 changes: 32 additions & 12 deletions apps/chatbot/README.md
@@ -7,10 +7,16 @@ Even though the provider is the Google one, we stored its API key in AWS. So, be

The Retrieval-Augmented Generation (RAG) was implemented using [llama-index](https://docs.llamaindex.ai/en/stable/). All the parameters and prompts used are stored in `config`.

The monitoring is done using [Langfuse](https://langfuse.com/) deployed on AWS.

## Environment Variables

Create a `.env` file inside this folder and store the environment variables listed in `.env.example`.

cp .env.example .env

Note that the environment variables inside the `.env` file should point to the AWS infrastructure.

## Virtual environment

Before creating your virtual environment, install [Miniconda](https://docs.anaconda.com/miniconda/#quick-command-line-install) and [Poetry](https://python-poetry.org/docs/main#installation) on your device.
@@ -36,30 +36,44 @@ In this way, `PYTHONPATH` points to where the Python packages and modules are, n

To reach the remote Redis instance, it is necessary to open a tunnel:

```
./scripts/redis-tunnel.sh
```

Verify that the HTML files that make up the Developer Portal documentation exist in a directory; otherwise, create the documentation first. Once the documentation directory is ready, put its path in `params` and then create the vector index by running:

```
python src/modules/create_vector_index.py --params config/params.yaml
```

This script reads the documentation, splits it into chunks with a hierarchical organization, and stores them in Redis.

Review the params so that your vector index is stored accordingly.
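The real splitting lives in `src/modules/create_vector_index.py` and is driven by the llama-index parameters in `config/params.yaml`; as a toy illustration of chunking with a hierarchical organization (the sizes here are made up):

```python
def hierarchical_chunks(text: str, sizes=(512, 128)) -> list:
    """Split `text` into parent chunks, then child chunks, keeping links.

    A simplified illustration only; the actual pipeline uses llama-index
    node parsers configured via config/params.yaml.
    """
    parent_size, child_size = sizes
    parents = [text[i:i + parent_size] for i in range(0, len(text), parent_size)]
    nodes = []
    for p_id, parent in enumerate(parents):
        children = [parent[i:i + child_size]
                    for i in range(0, len(parent), child_size)]
        nodes.append({"id": p_id, "text": parent, "children": children})
    return nodes
```

Child chunks are what gets embedded and retrieved, while the parent link preserves surrounding context for the answer-generation step.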

## test
## Test

In order to test the chatbot and its APIs, run:

pytest

For more details, read [TESTBOOK.md](https://github.com/pagopa/developer-portal/blob/main/apps/chatbot/TESTBOOK.md).

## Docker

In order to run the chatbot locally for the first time, you need to:

- install [Docker Compose](https://docs.docker.com/compose/install/),
- create `.env.local` file running:

cp .env.example .env.local

and fill it in,

- run the following bash scripts:

```
pytest
```
./docker/docker-compose-build-local.sh
./docker/docker-compose-run-create_index.sh

## Web App
In this way, the Docker images are built and the vector index is stored in Redis.

python src/webapp/app.py
Now you can start the API by running:

This script uses the [Gradio](https://www.gradio.app/) framework to launch a web application at the [default link](http://127.0.0.1:7860) where the user can interact with the chatbot.
./docker/docker-compose-up-api.sh

Both the [`user icon`](https://www.flaticon.com/free-icon/user_1077012) and the [`chatbot icon`](https://www.flaticon.com/free-icon/chatbot_8943377) are made by [Freepik](https://www.freepik.com/) and were downloaded from [Flaticon](https://www.flaticon.com/).
Note that `docker/compose.yaml` needs the `.env.local` file with the correct environment variables.
17 changes: 17 additions & 0 deletions apps/chatbot/TESTBOOK.md
@@ -0,0 +1,17 @@
# Testbook

In order to test the chatbot functions and its APIs, run:

pytest

This command tests the functions described in the table below.

| Function | Requirements | Masked Inputs | Description |
| :------------------------------: | :-----------------------: | :---------------------------------: | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------: |
| `test_connection_redis()` | Redis client | None | Check the connection with Redis is up |
| `test_connection_langfuse()` | Langfuse client | None | Check the connection with Langfuse is up |
| `test_cloud_connection()` | AWS or Gemini credentials | LLM and Embedding model ID | Check the models' loading |
| `test_prompt_templates()` | Llama-index | `prompts.yaml` | Check that the prompts have the same variables as the prompt templates |
| `test_pii_mask()` | Presidio | a string to mask | Check that Presidio works as expected |
| `test_messages_to_chathistory()` | Llama-index | chat history from the local storage | Check the correct creation of a chat history in Llama-index |
| `test_chat_generation()` | Llama-index | two queries | Check the chatbot generation given a query, then check it again by generating a new answer using a second query and the previous interaction as chat history |
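As an illustration of the shape of these checks, a connection test like `test_connection_redis()` reduces to a PING round trip. The fake client below stands in for `redis.Redis` so the sketch runs without a server; the real test targets the Redis stack started by docker compose:

```python
def check_redis(client) -> bool:
    # PING returns a truthy value when the Redis server is reachable.
    try:
        return bool(client.ping())
    except Exception:
        return False


class FakeRedis:
    """Stand-in for redis.Redis so the sketch runs without a server."""

    def __init__(self, up: bool = True):
        self.up = up

    def ping(self):
        if not self.up:
            raise ConnectionError("redis down")
        return True
```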
8 changes: 7 additions & 1 deletion apps/chatbot/config/params.yaml
@@ -52,6 +52,8 @@ config_presidio:
allow_list:
- Discovery
- discovery
- rispondo
- Rispondo
- Rif
- SEND
- send
@@ -61,8 +63,12 @@
- Gpd
- STATO
- stato
- PagoPA
- PagoPA
- pagoPA
- Devportal
- devPortal
- DevPortal
- devportal
- pagopa
- Pagopa
- Firma con IO
24 changes: 13 additions & 11 deletions apps/chatbot/config/prompts.yaml
@@ -36,12 +36,14 @@ qa_prompt_str: |
Task:
Given the query: {query_str}
Reply according to the `Chatbot Policy` listed above.
If the query is a thank-you message, transform it into a polite and contextually appropriate answer.
Answer:
refine_prompt_str: |
Given the original answer: {existing_answer}, we have the opportunity to refine it (only if needed) with some more context here below:
Given the original answer: {existing_answer},
we have the opportunity to refine it (only if needed) with some more context here below:
--------------------
{context_msg}
--------------------
@@ -53,15 +55,15 @@
Answer:
condense_prompt_str: |
Given a conversation (between Human and Assistant) and a follow up message from Human, rewrite the message to be a standalone question that captures all relevant context from the conversation.
The standalone question must be in Italian.
<Chat History>
Given the following chat history between a user and an AI assistant:
{chat_history}
<Follow Up Message>
--------------------
and a follow up question from user:
{question}
<Standalone question>
--------------------
Task:
Rephrase the follow up question to be a standalone question.
If the follow up question is a thank-you message, transform it into a polite and contextually appropriate standalone response.
The standalone question or response must be in Italian.
Standalone question or response:
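The condense template above declares `{chat_history}` and `{question}` variables, which llama-index fills essentially the way `str.format` does. A reduced illustration (the template text here is abridged from the YAML above):

```python
# Abridged version of condense_prompt_str from config/prompts.yaml.
CONDENSE_TMPL = (
    "Given the following chat history between a user and an AI assistant:\n"
    "{chat_history}\n"
    "--------------------\n"
    "and a follow up question from user:\n"
    "{question}\n"
    "--------------------\n"
    "Task:\n"
    "Rephrase the follow up question to be a standalone question.\n"
    "Standalone question or response:\n"
)


def render_condense(chat_history: str, question: str) -> str:
    # Fill the template variables the same way a prompt-template class would.
    return CONDENSE_TMPL.format(chat_history=chat_history, question=question)
```

Keeping the variable names in the YAML in sync with what the engine passes at call time is exactly what `test_prompt_templates()` guards against.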
10 changes: 6 additions & 4 deletions apps/chatbot/docker/app.local.Dockerfile
@@ -19,11 +19,13 @@ RUN pip install --upgrade pip \
&& pip install poetry awscli

WORKDIR /app
COPY pyproject.toml .
COPY poetry.lock .
COPY ./pyproject.toml .
COPY ./poetry.lock .
COPY ./src ./src
COPY ./config ./config
COPY ./scripts ./scripts

RUN poetry config virtualenvs.create false
RUN poetry install

COPY . .

CMD ["fastapi", "dev", "src/app/main.py", "--port", "8080", "--host", "0.0.0.0"]
8 changes: 6 additions & 2 deletions apps/chatbot/docker/compose.test.yaml
@@ -26,13 +26,17 @@ services:
- AWS_SECRET_ACCESS_KEY=dummy
- AWS_DEFAULT_REGION=local
healthcheck:
test: ["CMD-SHELL", '[ "$(curl -s -o /dev/null -I -w ''%{http_code}'' http://localhost:8000)" == "400" ]']
test:
[
"CMD-SHELL",
'[ "$(curl -s -o /dev/null -I -w ''%{http_code}'' http://localhost:8000)" == "400" ]',
]
interval: 10s
timeout: 10s
retries: 10
networks:
- ntw

redis:
image: redis/redis-stack:7.2.0-v13
networks:
43 changes: 41 additions & 2 deletions apps/chatbot/docker/compose.yaml
@@ -1,9 +1,9 @@
---
services:
api:
build:
context: ..
dockerfile: docker/app.local.Dockerfile
env_file: ../.env.local
command: "./scripts/run.local.sh"
ports:
- "8080:8080"
@@ -16,6 +16,25 @@ services:
condition: service_started
dynamodb:
condition: service_started
langfuse:
condition: service_started
env_file: ../.env.local
networks:
- ntw

postgres:
image: postgres:17.2-alpine
restart: always
healthcheck:
test: ["CMD-SHELL", "pg_isready -U postgres"]
interval: 3s
timeout: 3s
retries: 10
ports:
- "5432:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
env_file: ../.env.local
networks:
- ntw

@@ -35,25 +54,45 @@ services:
ports:
- "6379:6379"
- "8001:8001"
volumes:
- redis_data:/data
networks:
- ntw

create_index:
build:
context: ..
dockerfile: docker/app.local.Dockerfile
command: "python src/modules/create_vector_index.py --params config/params.yaml"
ports:
- "8080:8080"
volumes:
- ..:/app
- ../../nextjs-website/out:/app/build-devp/out
command: "python src/modules/create_vector_index.py --params config/params.yaml"
tty: true
depends_on:
redis:
condition: service_started
env_file: ../.env.local
networks:
- ntw

langfuse:
image: langfuse/langfuse:2
depends_on:
postgres:
condition: service_healthy
ports:
- "3001:3000"
env_file: ../.env.local
networks:
- ntw

volumes:
postgres_data:
driver: local
redis_data:
driver: local

networks:
ntw:
2 changes: 1 addition & 1 deletion apps/chatbot/docker/docker-compose-build-local.sh
@@ -1,2 +1,2 @@
#!/bin/bash
docker compose --env-file .env -f docker/compose.yaml -p chatbot build
docker compose -f docker/compose.yaml -p chatbot build
2 changes: 1 addition & 1 deletion apps/chatbot/docker/docker-compose-run-create_index.sh
@@ -1,2 +1,2 @@
#!/bin/bash
docker compose --env-file .env -f docker/compose.yaml -p chatbot run create_index
docker compose -f docker/compose.yaml -p chatbot run create_index