Bug: changing model resets the endpoint but not the UI #133
And another piece of advice: do not use "@" mentions. It is the most annoying thing you can do because it sends me extra notifications. Thanks!
I don't know how your endpoint got unset. If you removed the endpoint, then I'm not obliged to inform you that the endpoint has been removed. If you have specific settings, please share them with me. Otherwise this report will remain unclear and will be closed. Before providing your answer, please make sure it is relevant and clear. Nobody pays me to answer and support users 24/7, even during weekends. I'm doing it only because I care about users and want them to be satisfied. But when users flood me with an infinite number of bug reports and see only the negative in my work, I lose motivation to support them. Today is Saturday (not a working day in my country, so please have a conscience). Sorry, I blew up...
About 3. I think I understand a bit better what's going on. Starting from the Mistral endpoint, for example, when I switch the model to gpt-4o I get a notification that says "API has changed to support this model", and if I go back in I indeed see that the endpoint is now set to OpenAI. But if I switch from gpt-4o to gpt-4, for example, I still see the same notification even though the endpoint should be the same. And moreover, when I switch between OpenRouter models, the endpoint always ends up "unset", as I showed in the image above.
How does it check which API to use? I don't understand how I can specify which model should be assigned to which endpoint.
I would have shared my settings, but I have no idea which settings would be relevant. I don't think it's relevant to show my endpoints because they are all working; same for my list of favorite models.
Issue #127 is a good example of why sharing the settings is a good idea and how it can help resolve an issue.
Models like openai/gpt-4o or google/gemini-1.5 belong to OpenRouter because the model name consists of provider/modelname. A model like gpt-3.5-turbo (without a /) is a model of a single provider (the OpenAI API endpoint). I don't want to add a million clarifications and instructions because it would flood and overload the UI. I want humans to be humans and use the main thing that sets us apart from animals: thinking (if the model name contains a provider name, then logically it belongs to the API endpoint that supports multiple models from different providers). I'm not here to teach and explain obvious things.
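The rule described above can be sketched roughly as follows. This is a hypothetical illustration of the naming convention, not SpeakGPT's actual code; the function name and endpoint labels are assumptions.

```python
def resolve_endpoint(model_name: str) -> str:
    """Pick an API endpoint from a model id, per the convention above.

    A model id containing a provider prefix ("provider/model") belongs
    to a multi-provider endpoint such as OpenRouter; a bare model name
    belongs to the single provider's own endpoint.
    """
    if "/" in model_name:
        return "openrouter"  # e.g. "openai/gpt-4o", "google/gemini-1.5"
    return "openai"          # e.g. "gpt-3.5-turbo"
```

Under this convention, no per-model configuration is needed: the endpoint is inferred from the shape of the model id alone.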
I didn't see your answer in time before posting my last message so:
I don't know what settings would be relevant. The endpoints and models are all working fine. Can you tell me what to provide? I don't see errors in the logs.
I'm really sorry you take it that way. I too work a lot during the week (medical student), and the weekends are when I have more time to provide feedback. I don't want you to feel that you have to answer quickly or on weekends, of course! And let me insist again: this is your issue section. I've told you, more than anyone on GitHub, how highly I think of your app. This is an ISSUE section; what am I supposed to post besides issues?! Bug reports are not a criticism of your value as a human but feedback to help the dev and avoid issues for other users. The mental health of developers is a concern to me, but I don't see how you can think a happy heavy user who finds a bug is "seeing only the negative in my work". I really don't want you to lose motivation, especially when I am a HAPPY USER. SpeakGPT is providing great value to the world, and that is thanks to you!
I made a video to make it clearer: recording_20240622_125944.mp4. As you can see, changing just the model unsets the endpoint.
This is why I will not trust your report without a full set of settings + endpoints: Screenrecorder-2024-06-22-13-09-29-908.mp4
I think I figured it out: I removed all the favorites and re-added them, and it seems I don't have the issue anymore. Maybe it was caused by renaming or changing endpoints at some point? Does that mean that the favorites are linked to an endpoint name and not its URL? If so, maybe the favorites that no longer correspond to an endpoint should be removed or greyed out instead of unsetting the endpoint.
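The suggestion above can be sketched as follows. This is a hypothetical illustration assuming, as the report suspects, that each favorite stores the *name* of its endpoint rather than its URL; all names and the data layout are assumptions, not SpeakGPT's actual implementation.

```python
def filter_favorites(favorites, endpoint_names):
    """Split favorites into usable and stale ones.

    A favorite whose stored endpoint name no longer matches any existing
    endpoint is considered stale; the UI could grey it out instead of
    silently unsetting the endpoint when it is selected.
    """
    usable, stale = [], []
    for fav in favorites:
        target = usable if fav["endpoint"] in endpoint_names else stale
        target.append(fav)
    return usable, stale
```

Greying out stale favorites keeps the failure visible at selection time, rather than surfacing it later as a confusing "unset endpoint" state.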
Hi! I hope you're well :)!
I had many issues when fiddling around with endpoints while trying to make Claude 3.5 Sonnet work (I finally only got it working with OpenRouter, because Anthropic's API does not seem to follow OpenAI's API closely enough), and I finally understood that part of the problem is that if I change the model, the endpoint gets reset but the UI is not updated.
To reproduce:
I think it might be better not to reset the endpoint, because endpoints can have several models we want to use, especially with OpenRouter.
Additionally, the error message I get when trying to chat with a model is pretty obscure and appears only after a long time (probably because of a timeout):
Cleartext HTTP traffic to localhost not permitted
I think it might be a good idea to catch the error earlier by saying that the endpoint is unset. And while I'm at it: maybe the friendly name of each endpoint could be displayed instead of the URL in the settings and quick settings? But that's the cherry on top!
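The "catch it earlier" idea amounts to a fail-fast check before any request is sent. A minimal sketch, with an assumed function name and message (not SpeakGPT's actual code):

```python
def validate_endpoint(endpoint_url):
    """Fail fast with a clear message when no endpoint is configured,
    instead of letting the HTTP client hang until an obscure timeout
    or cleartext-traffic error surfaces."""
    if not endpoint_url:
        raise ValueError(
            "No API endpoint is set for this model. "
            "Please select an endpoint in settings."
        )
    return endpoint_url
```

The check costs nothing and turns a delayed, cryptic network error into an immediate, actionable message.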
Have a nice day @AndraxDev !