Experimentation with "internal" llm #505

Draft · wants to merge 1 commit into base: main
47 changes: 24 additions & 23 deletions src/backend/core/services/ai_services.py
Collaborator:
I think it is a good idea to do it directly in markdown.

These prompts should be improved as well; they use the same methods:
https://github.com/numerique-gouv/impress/blob/8b60bc57e6c980dfc5eed8932a3cf64244b2079b/src/backend/core/services/ai_services.py#L13-L36

Collaborator (author):

Yes, of course. I mentioned it in my commit: "Please note I haven't updated yet the other prompts, let's discuss it before."

@@ -35,10 +35,29 @@
),
}


AI_TRANSLATE = (
"Translate the markdown text to {language:s}, preserving markdown formatting. "
'Return JSON: {{"answer": "your translated markdown text in {language:s}"}}. '
"Do not provide any other information."
"""
You are a professional translator for `{language:s}`.

### Guidelines:
1. **Preserve exactly as-is:**
- All formatting, markdown, symbols, tags
- Names, numbers, URLs, citations
- Code blocks and technical terms

2. **Translation Rules:**
- Use natural expressions in the target language
- Match the tone of the source text (default: professional)
- Maintain original meaning precisely
- Adapt idioms to suit the target culture
- Ensure grammatical correctness and stylistic coherence

3. **Do Not:**
- Add, remove, or explain any content

Output only the translated text, keeping all original formatting intact.
"""
)
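The rewritten prompt above is an ordinary Python format string with a single `{language:s}` placeholder; since the JSON envelope is gone, the doubled braces of the old prompt are no longer needed. As a minimal sketch of how such a template is rendered, assuming a shortened, illustrative version of the constant rather than the project's exact text:

```python
# Illustrative, shortened stand-in for the AI_TRANSLATE template above.
TRANSLATE_TEMPLATE = (
    "You are a professional translator for `{language:s}`.\n"
    "Output only the translated text, keeping all original formatting intact."
)

# Render the system prompt for a concrete target language.
system_content = TRANSLATE_TEMPLATE.format(language="French")
print(system_content.splitlines()[0])
# → You are a professional translator for `French`.
```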


@@ -59,32 +78,14 @@ def call_ai_api(self, system_content, text):
"""Helper method to call the OpenAI API and process the response."""
response = self.client.chat.completions.create(
model=settings.AI_MODEL,
response_format={"type": "json_object"},
messages=[
{"role": "system", "content": system_content},
{"role": "user", "content": json.dumps({"markdown_input": text})},
{"role": "user", "content": text},
],
)

content = response.choices[0].message.content

try:
sanitized_content = re.sub(r'\s*"answer"\s*:\s*', '"answer": ', content)
sanitized_content = re.sub(r"\s*\}", "}", sanitized_content)
sanitized_content = re.sub(r"(?<!\\)\n", "\\\\n", sanitized_content)
sanitized_content = re.sub(r"(?<!\\)\t", "\\\\t", sanitized_content)

json_response = json.loads(sanitized_content)
except (json.JSONDecodeError, IndexError):
try:
json_response = json.loads(content)
except json.JSONDecodeError as err:
raise RuntimeError("AI response is not valid JSON", content) from err

if "answer" not in json_response:
raise RuntimeError("AI response does not contain an answer")

return json_response
return {"answer": content}

def transform(self, text, action):
"""Transform text based on specified action."""
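The net effect of this hunk: `response_format={"type": "json_object"}` and the regex/JSON sanitizing are dropped, the user message is sent as raw text instead of a `json.dumps` wrapper, and the raw completion is wrapped in `{"answer": ...}`. A minimal sketch of the resulting helper, with the client injected so the example stands alone; the names follow the diff, and the client is assumed to expose the OpenAI-style `chat.completions.create` method:

```python
class AIService:
    """Sketch of the simplified helper: no JSON mode, no response sanitizing.

    `client` is assumed to expose the OpenAI-style
    `chat.completions.create(model=..., messages=...)` method.
    """

    def __init__(self, client, model):
        self.client = client
        self.model = model

    def call_ai_api(self, system_content, text):
        """Call the chat API and wrap the raw answer in a dict."""
        response = self.client.chat.completions.create(
            model=self.model,
            messages=[
                {"role": "system", "content": system_content},
                {"role": "user", "content": text},  # raw text, no json.dumps envelope
            ],
        )
        # The completion text is returned as-is; the prompt is trusted to
        # produce plain markdown, so no JSON parsing or regex cleanup remains.
        return {"answer": response.choices[0].message.content}
```

Because the helper now returns whatever the model emits, any formatting guarantees move entirely into the prompt, which is why the translation prompt above insists on "output only the translated text".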
3 changes: 3 additions & 0 deletions src/helm/env.d/dev/values.impress.yaml.gotmpl
@@ -50,6 +50,9 @@ backend:
AWS_S3_SECRET_ACCESS_KEY: password
AWS_STORAGE_BUCKET_NAME: impress-media-storage
STORAGES_STATICFILES_BACKEND: django.contrib.staticfiles.storage.StaticFilesStorage
AI_API_KEY: **ask antoine**
AI_BASE_URL: https://albertine.beta.numerique.gouv.fr/v1/
AI_MODEL: meta-llama/Llama-3.1-8B-Instruct

migrate:
command: