[BUG] Wrong Perplexity model called (llama 3.0 -> 3.1) #642
Comments
Thanks @JJsilvera1, fixed in Big-AGI-2.
Okay, thank you!
@enricoros I signed up for the beta, but since I don't have access yet it doesn't really help 😂
@JJsilvera1 - I can run a pre-pilot program and admit you to the V2 alpha server. V2 is incredibly better than V1: multimodal, resilient, faster, shows token usage and pricing, supports new architectures, and has a few new large features I'm not mentioning yet. But some features are still not ported to V2, so it's taking me time and I want top quality. I've marked your email for pre-access; please follow up with me anytime.
Hello, I just signed up for the beta.
@enricoros Hello, I did sign up, but I also wanted to make a feature request that maybe most people wouldn't recommend: I just want to be able to have more than one window of big-AGI open. It would be cool because then I could run multiple prompts and cross-reference them in different tabs, but I get an error any time I duplicate the tab, saying a session is already open.
I agree it would be great. It's in the backlog of features, and we'll get to it once we scale the operation. This will require a centralized system / server.
Hi @enricoros, can I get access to v2 please?
+1
Hi @enricoros, can I get access to v2 as well please? Or is there any other way to get rid of this error? [Service Issue] Perplexity: Bad Request (400): "error": { "message": "Invalid model 'llama-3-sonar-large-32k-online'. Permitted models can be found in the documentation at https://docs.perplexity.ai/docs/model-cards.", "type": "invalid_model", "code": 400 }
Hello. When I use Perplexity, the call is sent to the wrong model:
[Service Issue] Perplexity: Bad Request (400): "error": { "message": "Invalid model 'llama-3-sonar-large-32k-chat'. Permitted models can be found in the documentation at https://docs.perplexity.ai/docs/model-cards.", "type": "invalid_model", "code": 400 }
It should be using the 3.1 models. I just added API access today.
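For anyone hitting this 400 before the fix lands in Big-AGI-2, the error goes away once the request uses a model id that Perplexity still permits. Below is a minimal sketch (not big-AGI code) of such a request against Perplexity's OpenAI-compatible chat completions endpoint; the replacement model name `llama-3.1-sonar-large-128k-online` is an assumption based on the model cards linked in the error message, so check https://docs.perplexity.ai/docs/model-cards for the currently permitted names.

```typescript
// Minimal sketch: call Perplexity's OpenAI-compatible endpoint with a
// currently permitted model id instead of the removed 32k Sonar model.
const PPLX_API_KEY = process.env.PPLX_API_KEY ?? '';

async function askPerplexity(prompt: string): Promise<string> {
  const response = await fetch('https://api.perplexity.ai/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${PPLX_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      // 'llama-3-sonar-large-32k-chat/-online' are rejected with invalid_model;
      // the 3.1 replacement assumed here is 'llama-3.1-sonar-large-128k-online'.
      model: 'llama-3.1-sonar-large-128k-online',
      messages: [{ role: 'user', content: prompt }],
    }),
  });

  if (!response.ok)
    throw new Error(`Perplexity: ${response.status} ${await response.text()}`);

  const data = await response.json();
  return data.choices[0].message.content;
}
```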