diff --git a/ChatQnA/README.md b/ChatQnA/README.md
index 4f56abf054..f01e252a53 100644
--- a/ChatQnA/README.md
+++ b/ChatQnA/README.md
@@ -53,6 +53,7 @@ To set up environment variables for deploying ChatQnA services, follow these ste
 ### Quick Start: 2.Run Docker Compose
 
 Select the compose.yaml file that matches your hardware.
+
 CPU example:
 
 ```bash
@@ -69,9 +70,13 @@ docker pull opea/chatqna:latest
 docker pull opea/chatqna-ui:latest
 ```
-If you want to build docker by yourself, please refer to `built from source`: [Guide](docker_compose/intel/cpu/xeon/README.md).
+In the following cases, you may need to build the Docker images from source yourself:
+
+- The Docker image fails to download.
+
+- You need the latest or a specific version.
 
-> Note: The optional docker image **opea/chatqna-without-rerank:latest** has not been published yet, users need to build this docker image from source.
+Please refer to the 'Build Docker Images' section in this [Guide](docker_compose/intel/cpu/xeon/README.md).
 
 ### QuickStart: 3.Consume the ChatQnA Service
diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README.md b/ChatQnA/docker_compose/intel/cpu/xeon/README.md
index a6e26c056d..3bab9a9358 100644
--- a/ChatQnA/docker_compose/intel/cpu/xeon/README.md
+++ b/ChatQnA/docker_compose/intel/cpu/xeon/README.md
@@ -47,9 +47,13 @@ docker pull opea/chatqna:latest
 docker pull opea/chatqna-ui:latest
 ```
-If you want to build docker by yourself, please refer to 'Build Docker Images' in below.
+In the following cases, you may need to build the Docker images from source yourself:
 
-> Note: The optional docker image **opea/chatqna-without-rerank:latest** has not been published yet, users need to build this docker image from source.
+- The Docker image fails to download.
+
+- You need the latest or a specific version.
+
+Please refer to the 'Build Docker Images' section below.
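The build-from-source fallback these hunks introduce could be sketched as follows. This is a hedged illustration, not part of the PR: the repository URL and directory layout are assumptions inferred from the READMEs' relative links, and the `VERSION` variable is hypothetical.

```shell
# Hypothetical fallback for when `docker pull opea/chatqna:latest` fails
# or a specific release is needed. Paths are assumptions, not taken from
# the diff.
VERSION="${VERSION:-latest}"        # e.g. "latest" or a pinned tag like "1.0"
IMAGE="opea/chatqna:${VERSION}"

git clone https://github.com/opea-project/GenAIExamples.git
cd GenAIExamples/ChatQnA

# Build locally under the same tag the compose files pull.
docker build -t "$IMAGE" .
echo "built ${IMAGE}"
```

The same pattern would apply to `opea/chatqna-ui` and to the unpublished **opea/chatqna-without-rerank** image mentioned in the removed note.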
 ## QuickStart: 3.Consume the ChatQnA Service
@@ -69,52 +73,25 @@ For detailed information about these instance types, you can refer to this [link
 
 After launching your instance, you can connect to it using SSH (for Linux instances) or Remote Desktop Protocol (RDP) (for Windows instances). From there, you'll have full access to your Xeon server, allowing you to install, configure, and manage your applications as needed.
 
-**Certain ports in the EC2 instance need to opened up in the security group, for the microservices to work with the curl commands**
+### Network Port & Security
 
-> See one example below. Please open up these ports in the EC2 instance based on the IP addresses you want to allow
+- Access the ChatQnA UI with a web browser
 
-```
-redis-vector-db
-===============
-Port 6379 - Open to 0.0.0.0/0
-Port 8001 - Open to 0.0.0.0/0
-
-tei_embedding_service
-=====================
-Port 6006 - Open to 0.0.0.0/0
-
-embedding
-=========
-Port 6000 - Open to 0.0.0.0/0
-
-retriever
-=========
-Port 7000 - Open to 0.0.0.0/0
-
-tei_xeon_service
-================
-Port 8808 - Open to 0.0.0.0/0
-
-reranking
-=========
-Port 8000 - Open to 0.0.0.0/0
-
-tgi-service or vLLM_service
-===========
-Port 9009 - Open to 0.0.0.0/0
-
-llm
-===
-Port 9000 - Open to 0.0.0.0/0
-
-chaqna-xeon-backend-server
-==========================
-Port 8888 - Open to 0.0.0.0/0
-
-chaqna-xeon-ui-server
-=====================
-Port 5173 - Open to 0.0.0.0/0
-```
+  The UI is served on port `80`. Please confirm that port `80` is open in the firewall of the EC2 instance.
+
+- Access a microservice with a tool or API
+
+  1. Log in to the EC2 instance and access the service by **local IP address** and port.
+
+     This is the recommended approach; no network port settings need to be changed.
+
+  2. Log in on a remote client and access the service by **public IP address** and port.
+
+     You need to open the microservice's port in the security group (firewall) settings of the EC2 instance.
+
+     For a detailed guide, please refer to [Validate Microservices](#validate-microservices).
+
+     Note: opening ports to the public increases security risk, so please confirm before doing so.
 
 ## 🚀 Build Docker Images
@@ -325,6 +302,7 @@ docker compose -f compose_vllm.yaml up -d
 
 ### Validate Microservices
 
+Note: when verifying the microservices with curl or an API from a remote client, please make sure the microservices' **ports** are open in the firewall of the cloud node.
 Follow the instructions to validate MicroServices.
 For details on how to verify the correctness of the response, refer to [how-to-validate_service](../../hpu/gaudi/how_to_validate_service.md).
diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
index fab6f1046c..82df9f37d9 100644
--- a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
+++ b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
@@ -48,9 +48,13 @@ docker pull opea/chatqna:latest
 docker pull opea/chatqna-ui:latest
 ```
-If you want to build docker by yourself, please refer to 'Build Docker Images' in below.
+In the following cases, you may need to build the Docker images from source yourself:
 
-> Note: The optional docker image **opea/chatqna-without-rerank:latest** has not been published yet, users need to build this docker image from source.
+- The Docker image fails to download.
+
+- You need the latest or a specific version.
+
+Please refer to the 'Build Docker Images' section below.
 
 ## QuickStart: 3.Consume the ChatQnA Service
diff --git a/ChatQnA/docker_compose/nvidia/gpu/README.md b/ChatQnA/docker_compose/nvidia/gpu/README.md
index dd21def278..01cd988147 100644
--- a/ChatQnA/docker_compose/nvidia/gpu/README.md
+++ b/ChatQnA/docker_compose/nvidia/gpu/README.md
@@ -48,9 +48,13 @@ docker pull opea/chatqna:latest
 docker pull opea/chatqna-ui:latest
 ```
-If you want to build docker by yourself, please refer to 'Build Docker Images' in below.
+In the following cases, you may need to build the Docker images from source yourself:
 
-> Note: The optional docker image **opea/chatqna-without-rerank:latest** has not been published yet, users need to build this docker image from source.
+- The Docker image fails to download.
+
+- You need the latest or a specific version.
+
+Please refer to the 'Build Docker Images' section below.
 
 ## QuickStart: 3.Consume the ChatQnA Service
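The EC2 port guidance in this PR can be made concrete with a short shell sketch. This is a hedged illustration, not part of the diff: the backend port `8888` comes from the removed port listing, but the security group ID, client CIDR, public IP, and the `/v1/chatqna` endpoint path are placeholders and assumptions to confirm against your own deployment and the Validate Microservices section.

```shell
# Open the ChatQnA backend port (8888, per the old port listing) to one
# client range rather than 0.0.0.0/0. All three values are placeholders.
SG_ID="sg-0123456789abcdef0"
CLIENT_CIDR="203.0.113.0/24"
PUBLIC_IP="198.51.100.7"

# Add an inbound rule to the instance's security group via the AWS CLI.
aws ec2 authorize-security-group-ingress \
  --group-id "$SG_ID" --protocol tcp --port 8888 --cidr "$CLIENT_CIDR"

# From the remote client, exercise the backend service. The endpoint path
# is an assumption; verify it against the README's validation commands.
URL="http://${PUBLIC_IP}:8888/v1/chatqna"
curl "$URL" -H "Content-Type: application/json" \
  -d '{"messages": "What is OPEA?"}'
```

Restricting the CIDR to a known client range keeps the exposure smaller than the blanket `0.0.0.0/0` rules shown in the removed example.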