AttributeError: 'Gpt4AllWebUI' object has no attribute 'chatbot_bindings' #183

Open
Sk1rm opened this issue May 12, 2023 · 7 comments

@Sk1rm

Sk1rm commented May 12, 2023

Expected Behavior

Just wrote my first message to the standard model, expecting an answer.

Current Behavior

No error is shown in the UI; in the terminal the error is "AttributeError: 'Gpt4AllWebUI' object has no attribute 'chatbot_bindings'".

Steps to Reproduce

I just installed it as shown in the tutorial video.

Context

I am using Python 3.10.11 on Debian 11

Screenshots

(screenshot attached)

@ParisNeo
Owner

This means that the model was not loaded. This can happen if your configuration points to a non-existent model. You need to verify in configs/config_local.com that the right backend and model are selected.
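
For reference, here is a minimal sketch of how one might check by hand that the config actually points to an existing model file. It assumes a YAML config with `backend` and `model` keys and a `models/<backend>/` folder layout; the config file name, key names, and layout are assumptions, so adjust them to match your own install.

```python
# Hedged sketch (not the project's own tooling): check that the configured
# backend/model pair resolves to a real file before starting the UI.
# The config file name, the "backend"/"model" keys, and the models/<backend>/
# layout are assumptions -- adjust them to match your install.
from pathlib import Path

import yaml  # pip install pyyaml

config_path = Path("configs/config_local.yaml")  # hypothetical name; use your actual config file
cfg = yaml.safe_load(config_path.read_text())

model_file = Path("models") / cfg["backend"] / cfg["model"]
if model_file.is_file():
    print(f"OK: model found at {model_file}")
else:
    print(f"Missing: {model_file} -- fix the backend/model entries in {config_path}")
```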

@Sk1rm
Author

Sk1rm commented May 12, 2023

Thanks for your fast response. So I tried to configure it, but it is showing the same error (screenshot attached).
Do I have to give the whole path to the model, or how do I do it?

Sk1rm closed this as completed May 12, 2023
Sk1rm reopened this May 12, 2023
@Sk1rm
Author

Sk1rm commented May 12, 2023

Sorry, closed it accidentally.

@ParisNeo
Owner

> Thanks for your fast response. So I tried to configure it, but it is showing the same error (screenshot attached). Do I have to give the whole path to the model, or how do I do it?

I think I see the problem. The gpt4all backend is new, and I think some models don't work properly on it. Select the pyllamacpp backend from the settings or in the configuration file, put the model in the models/llamacpp folder, and restart the app.
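
For anyone following along, here is a hedged sketch of what that change could look like when made by hand rather than through the settings page. The config file name, the `backend`/`model` keys, and the `llama_cpp` value are assumptions, so mirror whatever your working config already uses.

```python
# Hedged sketch: switch the config to the pyllamacpp backend and confirm the
# model file sits under models/llamacpp/ before restarting the app.
# The config file name, the "backend"/"model" keys, and the "llama_cpp" value
# are assumptions -- check them against your own install.
from pathlib import Path

import yaml  # pip install pyyaml

config_path = Path("configs/config_local.yaml")  # hypothetical name; use your actual config file
cfg = yaml.safe_load(config_path.read_text())

cfg["backend"] = "llama_cpp"  # assumed value for the pyllamacpp backend
config_path.write_text(yaml.safe_dump(cfg))

model_file = Path("models/llamacpp") / cfg["model"]
print("model present:" if model_file.is_file() else "model MISSING:", model_file)
```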

@Sk1rm
Author

Sk1rm commented May 13, 2023

Ok thanks, that was the problem. But I encountered another issue. In the old UI everything works fine, but after I switched to the new UI and changed the backend to llamacpp I can't select a model. I get the confirmation after changing the settings, but then I can't select a model from the dropdown menu because no models are shown. And after reloading the site the backend always switches back to gpt4all automatically, even though I set it to llamacpp in the config.

@jadenkiu
Contributor

Weird, on mine it automatically uses llama_cpp (probably because I don't have any models in the gpt_4all folder) and it works completely fine.

@ParisNeo
Owner

> Ok thanks, that was the problem. But I encountered another issue. In the old UI everything works fine, but after I switched to the new UI and changed the backend to llamacpp I can't select a model. I get the confirmation after changing the settings, but then I can't select a model from the dropdown menu because no models are shown. And after reloading the site the backend always switches back to gpt4all automatically, even though I set it to llamacpp in the config.

Hi, the new UI is a work in progress. I had some health issues the last few days, so I had to stop working on it. I hope these problems will be fixed over time. I'm doing my best, but unfortunately I can't control my health :(
