
Add observability #8

Closed
geoand opened this issue Nov 13, 2023 · 6 comments

Comments

@geoand
Collaborator

geoand commented Nov 13, 2023

Also, take a look at https://smith.langchain.com/

@geoand
Collaborator Author

geoand commented Nov 29, 2023

  • Metrics for LLM calls
  • Introduce an audit service concept that will be used by the logger if it exists
      ◦ We will need to introduce the proper context
      ◦ We need to make sure we don't log sensitive info
  • Add tracing
      ◦ We need spans per tool call in order to get insights into the entire lifecycle of the request
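The "spans per tool call" idea above would normally be implemented with OpenTelemetry in Quarkus; as a minimal, library-free sketch of the concept, a tool call can be wrapped so that its name, duration, and outcome are recorded. The `SpanRecorder` and `ToolSpan` names here are purely illustrative, not part of any proposed API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Illustrative stand-in for per-tool-call spans. In a real Quarkus app this
// would use OpenTelemetry spans; SpanRecorder/ToolSpan are hypothetical names.
public class SpanRecorder {

    // One recorded "span": which tool ran, when, for how long, and whether it failed.
    public record ToolSpan(String toolName, long startNanos, long durationNanos, boolean failed) {}

    private final List<ToolSpan> spans = new ArrayList<>();

    // Wrap a tool call so its duration and outcome are recorded as a span.
    public <T> T recordToolCall(String toolName, Supplier<T> toolCall) {
        long start = System.nanoTime();
        try {
            T result = toolCall.get();
            spans.add(new ToolSpan(toolName, start, System.nanoTime() - start, false));
            return result;
        } catch (RuntimeException e) {
            spans.add(new ToolSpan(toolName, start, System.nanoTime() - start, true));
            throw e;
        }
    }

    public List<ToolSpan> spans() {
        return spans;
    }
}
```

With one span per tool call, the whole request lifecycle becomes visible: the parent request span plus a child span for each tool invocation.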

@geoand
Collaborator Author

geoand commented Dec 4, 2023

@cescoffier what do you have in mind for the audit service?

@cescoffier
Collaborator

@geoand

The audit service would be an optional interface that the user can implement to store the interactions with the LLMs. It's like an access.log, but for LLMs, and the format should be defined by the users (as there is a good chance it needs to go into a database).

Basically, the interface will do something like:

Audit initAudit(String systemMessage, String userMessage)

with Audit being the object on which we can report the additional messages and RAG documents:

Audit addLLMToApplicationMessage(...)
Audit addApplicationToLLMMessage(...)
Audit addRelevantDocument(...)
void onCompletion(); // Complete the audit -> persist it, log it
void onFailure(Throwable t); // The LLM interaction failed

Each recorded step should have the message direction (LLM -> App or App -> LLM) and the timestamp.
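The contract described above could be sketched as follows. The method names follow the comment; the `Direction` and `Step` types and the in-memory behavior are my own illustration, not the actual implementation:

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Sketch of the proposed audit contract. Method names come from the comment;
// Direction, Step, and the in-memory storage are illustrative assumptions.
public class AuditSketch {

    public enum Direction { LLM_TO_APPLICATION, APPLICATION_TO_LLM }

    // Each recorded step carries the message direction and a timestamp, as proposed.
    public record Step(Direction direction, Instant timestamp, String content) {}

    public static class Audit {
        private final List<Step> steps = new ArrayList<>();
        private boolean completed;

        Audit(String systemMessage, String userMessage) {
            steps.add(new Step(Direction.APPLICATION_TO_LLM, Instant.now(), systemMessage));
            steps.add(new Step(Direction.APPLICATION_TO_LLM, Instant.now(), userMessage));
        }

        public Audit addLLMToApplicationMessage(String message) {
            steps.add(new Step(Direction.LLM_TO_APPLICATION, Instant.now(), message));
            return this;
        }

        public Audit addApplicationToLLMMessage(String message) {
            steps.add(new Step(Direction.APPLICATION_TO_LLM, Instant.now(), message));
            return this;
        }

        public Audit addRelevantDocument(String documentId) {
            steps.add(new Step(Direction.APPLICATION_TO_LLM, Instant.now(), "RAG:" + documentId));
            return this;
        }

        public void onCompletion() { completed = true; }      // persist it, log it, ...
        public void onFailure(Throwable t) { completed = false; }

        public List<Step> steps() { return steps; }
        public boolean isCompleted() { return completed; }
    }

    // Entry point matching "Audit initAudit(String systemMessage, String userMessage)".
    public static Audit initAudit(String systemMessage, String userMessage) {
        return new Audit(systemMessage, userMessage);
    }
}
```

Returning `Audit` from each `add*` method allows the fluent chaining the multi-message flow needs.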

Unknowns:

  • Should the audit be part of the same transaction as the LLM interaction? I would say yes, but it's rather convoluted to force that behavior in a first step.

@cescoffier
Collaborator

BTW, I went with this design because the multi-message nature is tricky. Also, I would make sure the memory ID is available.

@geoand
Collaborator Author

geoand commented Dec 4, 2023

#83 is what I am starting with.

I am thinking of leaving most things unimplemented and letting the users decide what they want to do.

geoand added 9 commits that referenced this issue between Dec 4 and Dec 6, 2023
@geoand
Collaborator Author

geoand commented Jan 30, 2024

Closing as we already have this

geoand closed this as completed Jan 30, 2024
2 participants