
Add the ability to use LLM Providers from the Arch config #112

Merged: 9 commits into main from llm-config, Oct 3, 2024

Conversation

junr03 (Collaborator) commented Oct 2, 2024:

No description provided.

junr03 added 4 commits October 1, 2024 11:38
Signed-off-by: José Ulises Niño Rivera <[email protected]>
Signed-off-by: José Ulises Niño Rivera <[email protected]>
Signed-off-by: José Ulises Niño Rivera <[email protected]>
Signed-off-by: José Ulises Niño Rivera <[email protected]>
junr03 requested a review from adilhafeez, October 2, 2024 19:09
junr03 added 3 commits October 2, 2024 12:11
Signed-off-by: José Ulises Niño Rivera <[email protected]>
Signed-off-by: José Ulises Niño Rivera <[email protected]>
def add_secret_key_to_llm_providers(config_yaml):
    llm_providers = []
    for llm_provider in config_yaml.get("llm_providers", []):
        if llm_provider['access_key'] == "mistral_access_key":
Contributor:

mistral_access_key should be $MISTRAL_ACCESS_KEY; same for the openai provider.

Contributor:

and should only modify access keys if OPENAI_API_KEY is not false

Collaborator Author (@junr03) commented Oct 2, 2024:

> and should only modify access keys if OPENAI_API_KEY is not false

@adilhafeez, this was actually part of a larger question I had: what should the behavior be if access keys are not correctly set? Should we panic and abort during initialization?

Contributor:

Just send the request upstream and let it fail; there may be some providers that don't require access keys (e.g. locally hosted models), so setting keys should be optional in my opinion.
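A minimal sketch of the optional-key behavior described above: attach the provider's auth header only when an access key is configured, and otherwise forward the request upstream without credentials. All type and field names here are assumptions for illustration, not the PR's actual API.

```rust
// Hypothetical shapes; only the "keys are optional" behavior mirrors the thread.
struct LlmProvider {
    api_key_header: &'static str,
    access_key: Option<String>, // None for e.g. locally hosted models
}

/// Build the outbound auth headers, if any.
fn auth_headers(provider: &LlmProvider) -> Vec<(String, String)> {
    match &provider.access_key {
        // Key configured: attach it under the provider-specific header name.
        Some(key) => vec![(provider.api_key_header.to_string(), key.clone())],
        // No key: send the request anyway and let the upstream reject it
        // if it actually requires credentials.
        None => Vec::new(),
    }
}

fn main() {
    let local = LlmProvider { api_key_header: "x-api-key", access_key: None };
    assert!(auth_headers(&local).is_empty());

    let hosted = LlmProvider {
        api_key_header: "x-api-key",
        access_key: Some("secret".to_string()),
    };
    assert_eq!(auth_headers(&hosted).len(), 1);
}
```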

pub fn api_key_header(&self) -> &str {
    self.api_key_header
}

#[derive(thiserror::Error, Debug)]
Contributor:

will use these in my errors too ❤️

Collaborator Author:

Yeah, this crate is excellent. I didn't want to pull it in too soon, but with this change I felt it was necessary.

Check out @dtolnay; a lot of very useful crates started with him, serde included.
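For readers unfamiliar with the crate, a minimal sketch of what thiserror buys here, using the error variant that appears later in this diff; the message string is illustrative, not the PR's actual text (assumes `thiserror = "1"` in Cargo.toml).

```rust
// `#[derive(thiserror::Error)]` generates the std::error::Error and Display
// impls from the `#[error(...)]` attributes, so no hand-written boilerplate.
#[derive(thiserror::Error, Debug)]
pub enum LlmProvidersNewError {
    #[error("more than one llm_provider is marked as default")]
    MoreThanOneDefault,
}

fn main() {
    // Display comes for free from the attribute above.
    println!("{}", LlmProvidersNewError::MoreThanOneDefault);
}
```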

Comment on lines +51 to +54
match llm_providers.default {
    Some(_) => return Err(LlmProvidersNewError::MoreThanOneDefault),
    None => llm_providers.default = Some(Rc::clone(&llm_provider)),
}
Contributor:

❤️
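For context, a sketch of how the match above could sit inside a constructor loop to enforce at most one default provider; everything except the match itself and `LlmProvidersNewError::MoreThanOneDefault` is assumed for illustration (and `thiserror` is assumed as a dependency).

```rust
use std::{collections::HashMap, rc::Rc};

#[derive(thiserror::Error, Debug)]
pub enum LlmProvidersNewError {
    #[error("more than one llm_provider is marked as default")]
    MoreThanOneDefault,
}

pub struct LlmProvider {
    pub name: String,
    pub default: bool,
}

#[derive(Default)]
pub struct LlmProviders {
    providers: HashMap<String, Rc<LlmProvider>>,
    default: Option<Rc<LlmProvider>>,
}

impl LlmProviders {
    pub fn try_new(configs: Vec<LlmProvider>) -> Result<Self, LlmProvidersNewError> {
        let mut llm_providers = LlmProviders::default();
        for config in configs {
            let llm_provider = Rc::new(config);
            if llm_provider.default {
                // The check under review: error out on a second default.
                match llm_providers.default {
                    Some(_) => return Err(LlmProvidersNewError::MoreThanOneDefault),
                    None => llm_providers.default = Some(Rc::clone(&llm_provider)),
                }
            }
            llm_providers
                .providers
                .insert(llm_provider.name.clone(), llm_provider);
        }
        Ok(llm_providers)
    }
}
```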

) -> Rc<LlmProvider> {
    let maybe_provider = provider_hint.and_then(|hint| match hint {
        ProviderHint::Default => llm_providers.default(),
        // FIXME: should a non-existent name in the hint be more explicit? i.e., return a BAD_REQUEST?
Contributor:

Should be a bad request in that case.
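A sketch of the stricter behavior suggested here: resolve the hint, but surface an unknown provider name as an error the filter can map to 400 BAD_REQUEST instead of silently falling back. Apart from `ProviderHint::Default`, the names below are assumptions.

```rust
use std::{collections::HashMap, rc::Rc};

pub struct LlmProvider { pub name: String }

pub struct LlmProviders {
    providers: HashMap<String, Rc<LlmProvider>>,
    default: Option<Rc<LlmProvider>>,
}

pub enum ProviderHint {
    Default,
    Name(String),
}

/// Resolution failures the caller maps to HTTP status codes.
pub enum ResolveError {
    /// Map to 400 BAD_REQUEST: the client named a provider that is not configured.
    UnknownProvider(String),
}

fn resolve_provider(
    llm_providers: &LlmProviders,
    provider_hint: Option<ProviderHint>,
) -> Result<Option<Rc<LlmProvider>>, ResolveError> {
    match provider_hint {
        Some(ProviderHint::Default) | None => Ok(llm_providers.default.clone()),
        Some(ProviderHint::Name(name)) => llm_providers
            .providers
            .get(&name)
            .cloned()
            .map(Some)
            // Instead of falling through to a default, reject explicitly.
            .ok_or(ResolveError::UnknownProvider(name)),
    }
}

fn main() {
    let mut providers = HashMap::new();
    providers.insert(
        "mistral".to_string(),
        Rc::new(LlmProvider { name: "mistral".to_string() }),
    );
    let llm_providers = LlmProviders { providers, default: None };

    // An unconfigured name resolves to an error rather than a silent fallback.
    assert!(resolve_provider(&llm_providers, Some(ProviderHint::Name("gpt".into()))).is_err());
}
```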

-if prompt_guard_resp.jailbreak_verdict.is_some()
-    && prompt_guard_resp.jailbreak_verdict.unwrap()
-{
+if prompt_guard_resp.jailbreak_verdict.unwrap_or_default() {
Contributor:

❤️
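A quick check that the refactor above preserves behavior: for an `Option<bool>`, `unwrap_or_default()` yields `false` on `None`, so the guard still fires only on `Some(true)`.

```rust
fn jailbreak_detected(verdict: Option<bool>) -> bool {
    // bool::default() is false, so None and Some(false) both fall through.
    verdict.unwrap_or_default()
}

fn main() {
    assert!(jailbreak_detected(Some(true)));
    assert!(!jailbreak_detected(Some(false)));
    assert!(!jailbreak_detected(None));
}
```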

Comment on lines +693 to +697
let msg = (*self.prompt_guards)
    .as_ref()
    .and_then(|pg| pg.jailbreak_on_exception_message())
    .unwrap_or("Jailbreak detected. Please refrain from discussing jailbreaking.");
return self.send_server_error(msg.to_string(), Some(StatusCode::BAD_REQUEST));
Contributor:

❤️

Signed-off-by: José Ulises Niño Rivera <[email protected]>
junr03 (Collaborator Author) commented Oct 2, 2024:

@adilhafeez I get this in the function calling demo: is this correct?
[Screenshot 2024-10-02 at 16:26:51]

Signed-off-by: José Ulises Niño Rivera <[email protected]>
    access_key: $MISTRAL_API_KEY
    model: "mistral-8x7b"

  - name: "MistralLocal7b"
    provider: "local"
Collaborator Author:

I noticed that neither the schema nor any other part of the code deals with local providers, and this PR does not enable that use case either.

I think the endpoint field and the provider field should be an enum, making the choice mutually exclusive. Let me know what you think, and I can iterate in a subsequent PR.

Contributor:

Yes, that would be a nice cleanup.
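A sketch of what that cleanup could look like: parse both fields as optional, then validate into an enum so the type system makes the choice explicit and "exactly one of provider/endpoint" is enforced after parsing. The names are assumptions for a follow-up PR, not code from this one (assumes `serde` with the derive feature and `serde_yaml`).

```rust
use serde::Deserialize;

/// Raw fields as they appear in the YAML; both are optional at parse time.
#[derive(Deserialize, Debug)]
struct RawProviderConfig {
    name: String,
    model: String,
    access_key: Option<String>,
    provider: Option<String>,
    endpoint: Option<String>,
}

/// The validated, mutually exclusive form.
#[derive(Debug)]
enum ProviderTarget {
    Hosted(String), // e.g. "mistral"
    Local(String),  // e.g. "http://localhost:8000"
}

impl RawProviderConfig {
    /// Enforce "exactly one of provider/endpoint" after parsing.
    fn target(&self) -> Result<ProviderTarget, String> {
        match (&self.provider, &self.endpoint) {
            (Some(p), None) => Ok(ProviderTarget::Hosted(p.clone())),
            (None, Some(e)) => Ok(ProviderTarget::Local(e.clone())),
            (Some(_), Some(_)) => {
                Err(format!("{}: provider and endpoint are mutually exclusive", self.name))
            }
            (None, None) => {
                Err(format!("{}: one of provider or endpoint is required", self.name))
            }
        }
    }
}

fn main() {
    let yaml = r#"
name: "MistralLocal7b"
model: "mistral-7b"
endpoint: "http://localhost:8000"
"#;
    let raw: RawProviderConfig = serde_yaml::from_str(yaml).unwrap();
    println!("{:?}", raw.target());
}
```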

adilhafeez (Contributor) commented:

> @adilhafeez I get this in the function calling demo: is this correct?

Yes, the model makes mistakes sometimes. Try asking "what is the weather in Chicago?"

junr03 merged commit 8ea917a into main, Oct 3, 2024
2 checks passed
junr03 deleted the llm-config branch, October 8, 2024 17:18