LLMs-Service is a service designed to solve multiple-choice question answering tasks with several different LLMs. It communicates exclusively with Common-Backend over Kafka.
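Since the service only talks to Common-Backend over Kafka, every question and answer travels as a message. A minimal sketch of what such an exchange could look like — the topic names, field names, and JSON layout below are assumptions for illustration, not part of the repository's documentation:

```shell
# Hypothetical request/response payloads for one multiple-choice question.
# Field names ("id", "question", "choices", "answer_index", "model") are
# illustrative assumptions.
REQUEST='{"id":"q-1","question":"2+2=?","choices":["3","4","5"],"model":"llm-a"}'
RESPONSE='{"id":"q-1","answer_index":1,"model":"llm-a"}'

# Such a request would be consumed from a request topic and the answer
# produced to a response topic, e.g. with Kafka's console tools:
#   kafka-console-producer.sh --bootstrap-server localhost:9092 --topic llm.requests
#   kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic llm.responses
echo "$REQUEST"
echo "$RESPONSE"
```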
- Build the Docker image of LLMs-Service from the project root directory:
docker build -t llms-service .
- Run the Docker image:
docker run llms-service
- Create a new branch from main:
git branch <YOUR_NICKNAME>/<FEATURE_NAME>
- Switch to your branch:
git checkout <YOUR_NICKNAME>/<FEATURE_NAME>
- Write code
- Test your code in the development environment
- Create a Pull Request
- Wait for approval
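The branching steps above can be tried out end to end in a throwaway repository. This is a self-contained sketch — the nickname "alice" and feature name "retry-logic" are hypothetical placeholders; substitute your own:

```shell
# Demo of the nickname/feature branch convention in a temporary repo.
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"

# Create the branch and switch to it (equivalent to `git branch` + `git checkout`).
git checkout -q -b alice/retry-logic
git branch --show-current
```

Note that `/` (not `:`) separates the nickname from the feature name: a colon is not a legal character in a Git branch name.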