Major LLM services and UI improvement #23
Conversation
.github/workflows/coqpilot.yml (outdated)
Need to add a `.nvmrc` file and fix the Node version
Yup, made it pretty much the same as you did in `ai_agents_server`, and also used it on CI.
    return text.replace("\n", "\\n");
}

static deserealizeFromString(rawRecord: string): [LoggerRecord, string] {
Why is the textual file format best for the logs?
It is a good question to be answered indeed, including in the docs. I left a note in `GenerationsLogger`'s docs about the possible performance overhead and how to resolve the issue if it arises in practice one day.
No worries for now though: neither the tests nor I experienced any problems; on the contrary, only +vibes from working with human-readable logs during debugging 🕊
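To make the format question concrete, here is a minimal sketch of the kind of single-line (de)serialization being discussed. The record shape, field set, and method bodies are simplified assumptions for illustration; only the `[record, rest]` return shape mirrors the `deserealizeFromString` signature visible in the diff above.

```typescript
// Hypothetical, simplified record; the real LoggerRecord has more fields.
class LoggerRecord {
    constructor(
        readonly timestampMillis: number,
        readonly message: string
    ) {}

    serializeToString(): string {
        // Escape every newline so the record occupies exactly one line.
        // Note the global regex: replace("\n", ...) with a string pattern
        // would only replace the first occurrence.
        const escaped = this.message.replace(/\n/g, "\\n");
        return `${this.timestampMillis} ${escaped}\n`;
    }

    static deserializeFromString(rawRecord: string): [LoggerRecord, string] {
        // Assumes rawRecord starts with at least one complete line;
        // returns the parsed record and the unconsumed rest of the text.
        const newlineIndex = rawRecord.indexOf("\n");
        const line = rawRecord.slice(0, newlineIndex);
        const rest = rawRecord.slice(newlineIndex + 1);
        const spaceIndex = line.indexOf(" ");
        const timestamp = Number(line.slice(0, spaceIndex));
        const message = line.slice(spaceIndex + 1).replace(/\\n/g, "\n");
        return [new LoggerRecord(timestamp, message), rest];
    }
}
```

The upside of such a format is exactly what the reply mentions: the log file stays human-readable, at the cost of some parsing overhead compared to a binary format.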
Changes summary

- ⏳ Estimate time for LLM services to become available via logging proof-generation requests
  - `GenerationsLogger` (see the sketch below)
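The summary only names the feature and `GenerationsLogger`; as a purely illustrative sketch (the heuristic, names, and signatures are assumptions, not the PR's actual logic), estimating the time until a service becomes available from logged request outcomes could look like this:

```typescript
interface LoggedRequest {
    timestampMillis: number;
    succeeded: boolean;
}

// Illustrative heuristic: after N consecutive logged failures, suggest waiting
// roughly baseDelay * 2^(N - 1) counted from the last attempt.
function estimateTimeToBecomeAvailable(
    logs: LoggedRequest[],
    baseDelayMillis: number = 1000
): number {
    let consecutiveFailures = 0;
    for (let i = logs.length - 1; i >= 0 && !logs[i].succeeded; i--) {
        consecutiveFailures++;
    }
    if (consecutiveFailures === 0) {
        return 0; // the last request succeeded, assume the service is available now
    }
    const lastAttempt = logs[logs.length - 1].timestampMillis;
    const delay = baseDelayMillis * 2 ** (consecutiveFailures - 1);
    return Math.max(0, lastAttempt + delay - Date.now());
}
```

The point of the sketch is only the idea: the estimate is derived from the request history that `GenerationsLogger` records, not from any extra state.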
- 🕵️ Support proper logging and error handling inside `LLMService`-s
  - `LLMService` errors: `LLMServiceError`, `ConfigurationError`, `RemoteConnectionError`, `GenerationFailedError` (see the sketch below).
  - `LLMService` implementation: `EventsLogger` and `GenerationsLogger`.
  - `LLMServiceRequest` to make data transfer coherent between different logic modules.
  - `LLMService`-s and UI to report errors.
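The four error classes above are named in the PR; their exact definitions are not shown here, so the following hierarchy is a minimal sketch with assumed constructor signatures and fields:

```typescript
// Base class for all errors reported by LLM services.
class LLMServiceError extends Error {
    constructor(message: string, readonly cause?: Error) {
        super(message);
        this.name = "LLMServiceError";
    }
}

// The user configured the model or service incorrectly.
class ConfigurationError extends LLMServiceError {
    constructor(message: string) {
        super(message);
        this.name = "ConfigurationError";
    }
}

// The remote service could not be reached (network issues, outages, ...).
class RemoteConnectionError extends LLMServiceError {
    constructor(message: string, cause?: Error) {
        super(message, cause);
        this.name = "RemoteConnectionError";
    }
}

// The generation request itself failed on the service side.
class GenerationFailedError extends LLMServiceError {
    constructor(cause: Error) {
        super(`generation failed: ${cause.message}`, cause);
        this.name = "GenerationFailedError";
    }
}
```

A single base class makes it easy for both the services and the UI to distinguish "fix your settings" errors from transient connection or generation failures.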
- 🏯 Rework `LLMService` architecture
  - `LLMServiceInternal` that:
    - `LLMService` to make them both concise and powerful.
    - `LLMService` classes based on recursive generics (see the sketch below).
    - `LLMService`-s safer and easier.
  - `LLMService` architecture.
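"Recursive generics" presumably refers to self-referential type parameters (F-bounded polymorphism). Below is a sketch of how such typing can tie `LLMService`, `LLMServiceInternal`, and a concrete service such as `OpenAIService` together; the method names, signatures, and fields are illustrative assumptions, not the PR's actual code.

```typescript
interface ModelParams {
    modelId: string;
}

// The base classes refer to the concrete service type in their own type
// parameters, so the internal implementation is typed against the actual
// subclass rather than the abstract base.
abstract class LLMServiceInternal<
    ParamsType extends ModelParams,
    ServiceType extends LLMService<ParamsType, ServiceType>
> {
    constructor(readonly service: ServiceType) {}

    abstract generateFromChat(
        params: ParamsType,
        choices: number
    ): Promise<string[]>;
}

abstract class LLMService<
    ParamsType extends ModelParams,
    ServiceType extends LLMService<ParamsType, ServiceType>
> {
    protected abstract readonly internal: LLMServiceInternal<ParamsType, ServiceType>;

    // Public facade shared by all services: delegates to the internal part,
    // which is where the service-specific logic and error handling live.
    async generateProofs(params: ParamsType, choices: number): Promise<string[]> {
        return this.internal.generateFromChat(params, choices);
    }
}

// A concrete service plugs itself into the ServiceType parameter.
interface OpenAiModelParams extends ModelParams {
    maxTokensToGenerate: number;
}

class OpenAIServiceInternal extends LLMServiceInternal<OpenAiModelParams, OpenAIService> {
    async generateFromChat(params: OpenAiModelParams, choices: number): Promise<string[]> {
        return []; // placeholder for the actual API call
    }
}

class OpenAIService extends LLMService<OpenAiModelParams, OpenAIService> {
    protected readonly internal = new OpenAIServiceInternal(this);
}
```

With this kind of bound, the shared base logic can hand a correctly typed service instance to its internal counterpart, which is what makes implementing new services safer.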
- ✅ Test everything
- 🚀 Fix & improve CI
  - `ocaml` and `opam` dependencies.
- 🤝 Improve LLM services' parameters: their naming, transparency, description
  - `modelId` to distinguish a model identifier from the name of an OpenAI / Grazie model.
  - Rename `newMessageMaxTokens` to `maxTokensToGenerate` for greater clarity.
  - Move `defaultChoices` to `ModelParams` to make its resolution more transparent in the code (see the sketch below).
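Only `modelId`, `maxTokensToGenerate` (formerly `newMessageMaxTokens`), and `defaultChoices` living in `ModelParams` are stated above; the split into user-facing versus resolved parameters below, and every other field, is an assumption made for illustration:

```typescript
// Illustrative shape only; field names other than modelId,
// maxTokensToGenerate, and defaultChoices are assumptions.
interface UserModelParams {
    // Identifies this configured model in settings, logs, and error messages;
    // distinct from the underlying OpenAI / Grazie model name.
    modelId: string;
    modelName: string;

    // Formerly `newMessageMaxTokens`: an upper bound on the number of tokens
    // generated per request.
    maxTokensToGenerate?: number;
}

// Resolved parameters actually used by the services; `defaultChoices` lives
// here, which makes its resolution explicit in the code.
interface ModelParams {
    modelId: string;
    modelName: string;
    maxTokensToGenerate: number;
    defaultChoices: number;
}
```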
- 🕊 Rework and significantly improve settings validation
  - `SettingsValidationError` (see the sketch below)
  - `modelId`-s
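The summary names `SettingsValidationError` and `modelId`-s without detail; purely as an illustration, one plausible validation (uniqueness of `modelId`-s across configured models) could be expressed with that error type as follows. The actual checks in the PR may differ.

```typescript
class SettingsValidationError extends Error {
    constructor(message: string) {
        super(message);
        this.name = "SettingsValidationError";
    }
}

// Hypothetical check: every configured model must have a unique modelId,
// since modelId is what logs, errors, and the UI use to refer to a model.
function validateModelIdsAreUnique(models: { modelId: string }[]): void {
    const seen = new Set<string>();
    for (const model of models) {
        if (seen.has(model.modelId)) {
            throw new SettingsValidationError(
                `models must have unique \`modelId\`-s, but "${model.modelId}" is used more than once`
            );
        }
        seen.add(model.modelId);
    }
}
```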
- 💚🤖 Improve interaction with OpenAI
  - `OpenAIService`: handle `maxTokensToGenerate` appropriately (see the sketch below).
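What handling `maxTokensToGenerate` "appropriately" means is not spelled out in the summary; a hedged guess is keeping prompt plus completion within the model's context window, for example:

```typescript
// Hypothetical helper: clamp the requested number of generated tokens so that
// prompt + completion fit into the model's context window.
function resolveMaxTokensToGenerate(
    requestedMaxTokens: number,
    promptTokens: number,
    modelContextWindow: number
): number {
    const available = modelContextWindow - promptTokens;
    if (available <= 0) {
        throw new Error("the prompt alone exceeds the model's context window");
    }
    return Math.min(requestedMaxTokens, available);
}
```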
- 🌿 Improve code quality
  - Split `CoqPilot` into smaller pieces.
  - Make `Error`-s coherent.
  - Get rid of `any`-s.