
Info retrieval and LLM function calling with parameter #341

Open
eric-gardyn opened this issue Feb 20, 2024 · 5 comments
@eric-gardyn
Contributor

I successfully added function calling to my LLM (using mongo-chatbot-server).
On the calls to /conversations (and /conversations/:id/messages), I pass a user ID parameter and store it as customData in the conversations collection.
When the user's query is a personal request, the LLM invokes my function as expected.
However, inside the function I need the context (the user ID) so I can run a query against another MongoDB collection (or a SQL DB).
Do you have a best-practice approach?
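A minimal sketch of the setup described above, assuming the user ID arrives on a request header: the helper derives the customData object stored on the conversation. The `extractCustomData` name, the `x-user-id` header, and the interfaces below are illustrative assumptions, not part of the mongo-chatbot-server API.

```typescript
// Hypothetical sketch: pull a user ID off the incoming request and
// return it as the customData to persist on the conversation document.

interface IncomingRequestLike {
  headers: Record<string, string | undefined>;
}

interface ConversationCustomData {
  userId?: string;
}

// Returns { userId } when the header is present, otherwise an empty object.
function extractCustomData(req: IncomingRequestLike): ConversationCustomData {
  const userId = req.headers["x-user-id"];
  return userId ? { userId } : {};
}
```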

@eric-gardyn
Contributor Author

I found a "hacky" way: reading the parameter from the `request` object that is passed to the tool's `call` method. But it feels a little weird, no?
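For reference, that "hacky" pattern could be sketched like this, with the tool reading the user ID off the request it receives. The `RequestLike` and tool shapes below are simplified assumptions for illustration, not the library's actual types, and `query.userId` is one possible location for the parameter.

```typescript
// Sketch: a tool whose `call` method receives the incoming request and
// reads the user ID from it before querying another data source.

interface RequestLike {
  query: Record<string, string | undefined>;
}

// Pull the user ID out of the request; returns undefined if absent.
function getUserIdFromRequest(req: RequestLike): string | undefined {
  return req.query.userId;
}

const lookupUserOrders = {
  name: "lookup_user_orders",
  async call({ request }: { request: RequestLike }): Promise<string> {
    const userId = getUserIdFromRequest(request);
    if (!userId) {
      return "No user ID on the request; cannot run a personal query.";
    }
    // A real implementation would query MongoDB (or SQL) here with userId.
    return `Looked up personal data for user ${userId}`;
  },
};
```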

@mongodben
Collaborator

your "hacky" way of doing this is the way that i designed it to work 😄

do you have any thoughts on a way that'd feel better to you?

my thought with this implementation, passing the request, is for the case where the request contains credentials like an access token that you don't want to persist to custom user data.


an alternative that's not currently possible would be to parse any request credentials in middleware, include anything you want for protected actions in `Response.locals`, and then use that in the tool call. with that said, the `Response` object isn't currently accessible from the tool call function, so we'd need to add that logic.
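The middleware alternative described above could look roughly like the following, assuming Express-style `req`/`res` objects. `verifyAccessToken` is a made-up stand-in for real credential parsing, and, as the comment notes, the tool call can't actually reach `res.locals` in the library today, so this is purely illustrative.

```typescript
// Sketch: middleware parses credentials once and stashes only the derived
// userId on res.locals, keeping the raw token out of persisted customData.

interface ReqLike {
  headers: Record<string, string | undefined>;
}
interface ResLike {
  locals: Record<string, unknown>;
}

// Hypothetical token check; a real app would verify a JWT or session.
function verifyAccessToken(token: string): { userId: string } | null {
  return token.startsWith("valid-")
    ? { userId: token.slice("valid-".length) }
    : null;
}

function authMiddleware(req: ReqLike, res: ResLike, next: () => void): void {
  const auth = req.headers["authorization"] ?? "";
  const token = auth.replace(/^Bearer /, "");
  const claims = verifyAccessToken(token);
  if (claims) {
    // Only the derived userId lives on res.locals for this request;
    // the credential itself is never written to the database.
    res.locals.userId = claims.userId;
  }
  next();
}
```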

@eric-gardyn
Contributor Author

I was thinking I could read it from the customData (since the server is already parsing the request and augmenting it with my own data)

@mongodben
Collaborator

> I was thinking I could get the CustomData (since it is already parsing the request and augmenting it with my own data)

yeah, that could work as long as whatever you're looking to include in the request can also be safely stored in your DB.

@eric-gardyn
Contributor Author

My goal is to be able to retrieve user information that can be fed to the LLM.
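One illustration of that goal: once the tool has the userId from the conversation's customData, it can look up the user's record and fold it into the text returned to the LLM. The in-memory `userDb`, `buildUserContext`, and the record fields are all hypothetical.

```typescript
// Sketch: turn a user record into context text the LLM can use.

interface UserRecord {
  name: string;
  plan: string;
}

// Stand-in for a MongoDB collection or SQL table keyed by userId.
const userDb: Record<string, UserRecord> = {
  u123: { name: "Ada", plan: "premium" },
};

function buildUserContext(customData: { userId?: string }): string {
  const user = customData.userId ? userDb[customData.userId] : undefined;
  if (!user) {
    return "No user context available.";
  }
  // This string is returned from the tool call so the LLM sees it.
  return `User ${user.name} is on the ${user.plan} plan.`;
}
```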
