Merge pull request #63 from Arquisoft/#59
#59 - Fixed LLM key exposure
carlosfernandezmartinez authored Feb 28, 2025
2 parents 9f85f5d + d0f5b4a commit 9adad72
Showing 12 changed files with 60 additions and 46 deletions.
5 changes: 2 additions & 3 deletions .github/workflows/release.yml
Original file line number Diff line number Diff line change
@@ -54,14 +54,13 @@ jobs:
uses: elgohr/Publish-Docker-Github-Action@v5
env:
API_URI: http://${{ secrets.DEPLOY_HOST }}:8000
LLM_API_KEY: ${{ secrets.LLM_API_KEY }}
with:
name: arquisoft/wichat_en2b/webapp
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
registry: ghcr.io
workdir: webapp
buildargs: API_URI,LLM_API_KEY
buildargs: API_URI
docker-push-authservice:
name: Push auth service Docker Image to GitHub Packages
runs-on: ubuntu-latest
@@ -154,4 +153,4 @@ jobs:
command: |
wget https://raw.githubusercontent.com/arquisoft/wichat_en2b/master/docker-compose.yml -O docker-compose.yml
docker compose --profile prod down
docker compose --profile prod up -d --pull always
docker compose --profile prod up -d --pull always
9 changes: 4 additions & 5 deletions README.md
@@ -30,9 +30,9 @@ First, clone the project:
In order to communicate with the LLM integrated in this project, we need to set up an API key. Two integrations are available in this prototype: gemini and empathy. The API key provided must match the LLM provider used.

We need to create two .env files.
- The first one in the webapp directory (for executing the webapp using ```npm start```). The content of this .env file should be as follows:
- The first one in the llmservice directory (for executing the llmservice using ```npm start```). The content of this .env file should be as follows:
```
REACT_APP_LLM_API_KEY="YOUR-API-KEY"
LLM_API_KEY="YOUR-API-KEY"
```
- The second one located in the root of the project (alongside the docker-compose.yml). This .env file is used by docker compose when launching the app with Docker. The content of this .env file should be as follows:
```
@@ -41,8 +41,7 @@ LLM_API_KEY="YOUR-API-KEY"

Note that these files must NOT be uploaded to the GitHub repository (they are excluded in the .gitignore).

An extra configuration for the LLM to work in the deployed version of the app is to include it as a repository secret (LLM_API_KEY). This secret will be used by GitHub Action when building and deploying the application.

An extra configuration for the LLM to work in the deployed version of the app is to create the same .env file (with the LLM_API_KEY variable) in the virtual machine (in the home directory of the azureuser).
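The .env files described above are read by the dotenv package at service start-up. As a rough illustration of what that loading step does with the `LLM_API_KEY` line (a toy stand-in, not the real `dotenv` implementation):

```javascript
// Toy sketch of dotenv-style loading: the real project uses the `dotenv`
// package; this stand-in only handles simple KEY="VALUE" lines.
function parseDotEnv(text) {
  const env = {};
  for (const line of text.split('\n')) {
    const m = line.match(/^\s*([A-Za-z_][A-Za-z0-9_]*)\s*=\s*"?([^"]*)"?\s*$/);
    if (m) env[m[1]] = m[2];
  }
  return env;
}

const parsed = parseDotEnv('LLM_API_KEY="YOUR-API-KEY"');
console.log(parsed.LLM_API_KEY); // YOUR-API-KEY
```

In the real service these values land on `process.env`, which is why the same variable name must appear in the webapp/VM .env files and in the repository secret.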

### Launching using Docker
For launching the prototype using docker compose, just type:
@@ -116,4 +115,4 @@ This action uses three secrets that must be configured in the repository:
- DEPLOY_USER: user with permission to execute the commands in the remote machine.
- DEPLOY_KEY: key to authenticate the user in the remote machine.
Note that this action logs in the remote machine and downloads the docker-compose file from the repository and launches it. Obviously, previous actions have been executed which have uploaded the docker images to the GitHub Packages repository.
Note that this action logs in the remote machine and downloads the docker-compose file from the repository and launches it. Obviously, previous actions have been executed which have uploaded the docker images to the GitHub Packages repository.
12 changes: 6 additions & 6 deletions docker-compose.yml
@@ -42,7 +42,10 @@ services:
container_name: llmservice-wichat_en2b
image: ghcr.io/arquisoft/wichat_en2b/llmservice:latest
profiles: ["dev", "prod"]
build: ./llmservice
env_file:
- .env
build:
context: ./llmservice
ports:
- "8003:8003"
networks:
@@ -71,10 +74,7 @@ services:
container_name: webapp-wichat_en2b
image: ghcr.io/arquisoft/wichat_en2b/webapp:latest
profiles: ["dev", "prod"]
build:
context: ./webapp
args:
LLM_API_KEY: ${LLM_API_KEY}
build: ./webapp
depends_on:
- gatewayservice
ports:
@@ -121,4 +121,4 @@ volumes:

networks:
mynetwork:
driver: bridge
driver: bridge
3 changes: 2 additions & 1 deletion llmservice/.dockerignore
@@ -1,2 +1,3 @@
node_modules
coverage
coverage
.env
2 changes: 1 addition & 1 deletion llmservice/Dockerfile
@@ -17,4 +17,4 @@ COPY . .
EXPOSE 8003

# Define the command to run your app
CMD ["node", "llm-service.js"]
CMD ["node", "llm-service.js"]
12 changes: 9 additions & 3 deletions llmservice/llm-service.js
@@ -6,6 +6,8 @@ const port = 8003;

// Middleware to parse JSON in request body
app.use(express.json());
// Load environment variables
require('dotenv').config();

// Define configurations for different LLM APIs
const llmConfigs = {
@@ -71,9 +73,14 @@ async function sendQuestionToLLM(question, apiKey, model = 'gemini') {
app.post('/ask', async (req, res) => {
try {
// Check if required fields are present in the request body
validateRequiredFields(req, ['question', 'model', 'apiKey']);
validateRequiredFields(req, ['question', 'model']);

const { question, model, apiKey } = req.body;
const { question, model } = req.body;
// Load the API key from an environment variable
const apiKey = process.env.LLM_API_KEY;
if (!apiKey) {
return res.status(400).json({ error: 'API key is missing.' });
}
const answer = await sendQuestionToLLM(question, apiKey, model);
res.json({ answer });

@@ -88,4 +95,3 @@ const server = app.listen(port, () => {

module.exports = server
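The diff above is the core of the fix: the API key is no longer accepted from the request body, it is read from the server's environment instead. A minimal sketch (with assumed function names, not the service's actual helpers) of the new resolution logic:

```javascript
// Sketch of the key-resolution pattern introduced above: the key comes from
// the server environment, never from the client request body. If the
// variable is unset, the endpoint answers 400 with the same error message.
function resolveApiKey(env) {
  const apiKey = env.LLM_API_KEY;
  if (!apiKey) {
    return { status: 400, error: 'API key is missing.' };
  }
  return { status: 200, apiKey };
}

console.log(resolveApiKey({ LLM_API_KEY: 'secret' }).status); // 200
console.log(resolveApiKey({}).status); // 400
```

This is what lets the `apiKey` field disappear from `validateRequiredFields` and from every client payload.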


5 changes: 4 additions & 1 deletion llmservice/llm-service.test.js
@@ -1,3 +1,6 @@
// Set a fake API key for the service under test
process.env.LLM_API_KEY = 'test-api-key';

const request = require('supertest');
const axios = require('axios');
const app = require('./llm-service');
@@ -22,7 +25,7 @@ describe('LLM Service', () => {
it('the llm should reply', async () => {
const response = await request(app)
.post('/ask')
.send({ question: 'a question', apiKey: 'apiKey', model: 'gemini' });
.send({ question: 'a question', model: 'gemini' });

expect(response.statusCode).toBe(200);
expect(response.body.answer).toBe('llmanswer');
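Note the ordering in the test diff: `process.env.LLM_API_KEY` is seeded before `require('./llm-service')`. A toy illustration (hypothetical names) of why seeding must happen before anything reads the variable:

```javascript
// Toy illustration: a value read from the environment at load time captures
// whatever is present at that moment, so tests must seed the variable first.
process.env.LLM_API_KEY = 'test-api-key';

function makeService(env) {
  const apiKey = env.LLM_API_KEY; // captured once, like a module-level const
  return { hasKey: () => Boolean(apiKey) };
}

const service = makeService(process.env);
console.log(service.hasKey()); // true
```

Seeding after the module loads would leave any load-time reads undefined, which is the failure mode this ordering avoids.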
15 changes: 14 additions & 1 deletion llmservice/package-lock.json

Some generated files are not rendered by default.

21 changes: 11 additions & 10 deletions llmservice/package.json
@@ -8,13 +8,14 @@
"author": "",
"license": "ISC",
"description": "",
"homepage": "https://github.com/arquisoft/wichat_en2b#readme",
"dependencies": {
"axios": "^1.7.9",
"express": "^4.21.2"
},
"devDependencies": {
"jest": "^29.7.0",
"supertest": "^7.0.0"
}
}
"homepage": "https://github.com/arquisoft/wichat_0#readme",
"dependencies": {
"axios": "^1.7.9",
"dotenv": "^16.4.7",
"express": "^4.21.2"
},
"devDependencies": {
"jest": "^29.7.0",
"supertest": "^7.0.0"
}
}
4 changes: 1 addition & 3 deletions webapp/Dockerfile
@@ -7,14 +7,12 @@ WORKDIR /app
RUN npm install --omit=dev

ARG API_URI="http://localhost:8000"
ARG LLM_API_KEY
ENV REACT_APP_API_ENDPOINT=$API_URI
ENV REACT_APP_LLM_API_KEY=$LLM_API_KEY

#Create an optimized version of the webapp
RUN npm run build
RUN npm install -g serve --production

#Execute npm run prod to run the server
CMD [ "npm", "run", "prod" ]
#CMD ["npm", "start"]
#CMD ["npm", "start"]
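Dropping `ARG LLM_API_KEY` / `ENV REACT_APP_LLM_API_KEY` from the webapp Dockerfile is what actually closes the leak: Create-React-App-style builds substitute `process.env.REACT_APP_*` literals into the shipped bundle at build time, so the value becomes readable by every browser user. A toy sketch of that inlining step (not CRA's real implementation):

```javascript
// Toy sketch of build-time env inlining: REACT_APP_* references in the
// source are replaced by their literal values in the output bundle, which
// is served to every client. A key baked in this way is public.
function inlineEnv(source, env) {
  return source.replace(/process\.env\.(REACT_APP_[A-Z0-9_]+)/g,
    (_, name) => JSON.stringify(env[name] ?? ''));
}

const bundle = inlineEnv(
  'const key = process.env.REACT_APP_LLM_API_KEY;',
  { REACT_APP_LLM_API_KEY: 'sk-secret' }
);
console.log(bundle); // const key = "sk-secret";
```

After this commit the key only ever exists server-side in the llmservice container's environment.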
4 changes: 2 additions & 2 deletions webapp/src/App.js
@@ -17,7 +17,7 @@ function App() {
<Container component="main" maxWidth="xs">
<CssBaseline />
<Typography component="h1" variant="h5" align="center" sx={{ marginTop: 2 }}>
Welcome to the 2025 edition of the Software Architecture course
Welcome to the 2025 edition of the Software Architecture course!
</Typography>
{showLogin ? <Login /> : <AddUser />}
<Typography component="div" align="center" sx={{ marginTop: 2 }}>
@@ -35,4 +35,4 @@
);
}

export default App;
export default App;
14 changes: 4 additions & 10 deletions webapp/src/components/Login.js
@@ -14,22 +14,16 @@ const Login = () => {
const [openSnackbar, setOpenSnackbar] = useState(false);

const apiEndpoint = process.env.REACT_APP_API_ENDPOINT || 'http://localhost:8000';
const apiKey = process.env.REACT_APP_LLM_API_KEY || 'None';


const loginUser = async () => {
try {
const response = await axios.post(`${apiEndpoint}/login`, { username, password });

const question = "Please, generate a greeting message for a student called " + username + " that is a student of the Software Architecture course in the University of Oviedo. Be nice and polite. Two to three sentences max.";
const model = "empathy"

if (apiKey==='None'){
setMessage("LLM API key is not set. Cannot contact the LLM.");
}
else{
const message = await axios.post(`${apiEndpoint}/askllm`, { question, model, apiKey })
setMessage(message.data.answer);
}
const message = await axios.post(`${apiEndpoint}/askllm`, { question, model })
setMessage(message.data.answer);
// Extract data from the response
const { createdAt: userCreatedAt } = response.data;

@@ -93,4 +87,4 @@ const Login = () => {
);
};

export default Login;
export default Login;
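On the client side the Login change is symmetric: the `apiKey` field and the `REACT_APP_LLM_API_KEY` fallback disappear, and the payload sent to `/askllm` carries only the question and model. A small sketch (hypothetical helper name, abbreviated prompt) of the new payload shape:

```javascript
// Sketch of the slimmer client payload after the fix: no apiKey field;
// the llmservice supplies the key from its own environment.
function buildAskPayload(username) {
  const question = 'Please, generate a greeting message for a student called ' +
    username + '.'; // abbreviated stand-in for the full prompt in Login.js
  return { question, model: 'empathy' };
}

const payload = buildAskPayload('ada');
console.log('apiKey' in payload); // false
```

This also removes the old "LLM API key is not set" client-side branch, since the client no longer knows or cares whether a key exists.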
