
Fail to load llama_cpp on a fresh install #144

Closed
benbender opened this issue May 11, 2024 · 1 comment
Labels
bug Something isn't working

Comments

benbender commented May 11, 2024

Describe the bug
After a fresh install of home-llm, the integration fails to load: it is unable to import the llama_cpp module, even though the log reports that the module was installed successfully.

Home Assistant is running inside Podman on a small server with an Intel(R) Celeron(R) N5105 @ 2.00GHz CPU.

Expected behavior
The integration should load successfully :)

Logs

2024-05-11 20:17:20.609 INFO (SyncWorker_36) [custom_components.llama_conversation.agent] Using model file '/config/media/models/models--acon96--Home-3B-v3-GGUF/snapshots/1f20ec6ddf2cbf9e6996c9f8a524bc5d80abb42e/Home-3B-v3.q3_k_m.gguf'
2024-05-11 20:17:22.868 INFO (SyncWorker_36) [homeassistant.util.package] Attempting install of https://github.com/acon96/home-llm/releases/download/v0.2.17/llama_cpp_python-0.2.70-cp312-cp312-musllinux_1_2_x86_64.whl
2024-05-11 20:17:30.569 INFO (SyncWorker_36) [custom_components.llama_conversation.utils] llama-cpp-python successfully installed from GitHub release
2024-05-11 20:17:32.774 ERROR (MainThread) [homeassistant.config_entries] Error setting up entry LLM Model 'acon96/Home-3B-v3-GGUF' (llama.cpp) for llama_conversation
Traceback (most recent call last):
  File "/config/custom_components/llama_conversation/agent.py", line 546, in _load_model
    self.llama_cpp_module = importlib.import_module("llama_cpp")
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/util/loop.py", line 144, in protected_loop_func
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'llama_cpp'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 575, in async_setup
    result = await component.async_setup_entry(hass, self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 67, in async_setup_entry
    agent = await hass.async_add_executor_job(create_agent, backend_type)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 63, in create_agent
    return agent_cls(hass, entry)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 139, in __init__
    self._load_model(entry)
  File "/config/custom_components/llama_conversation/agent.py", line 554, in _load_model
    self.llama_cpp_module = importlib.import_module("llama_cpp")
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/util/loop.py", line 144, in protected_loop_func
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'llama_cpp'
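The traceback shows that `importlib.import_module("llama_cpp")` raises `ModuleNotFoundError` right after the wheel install is logged as successful. A minimal diagnostic sketch (not part of the integration; the container name is a placeholder) is to ask the same interpreter Home Assistant runs, e.g. via `podman exec -it <container> python3`, whether the module is importable at all:

```python
import importlib.util
import sys

def check_module(name: str) -> str:
    """Report where a top-level module would be imported from, if anywhere."""
    spec = importlib.util.find_spec(name)
    if spec is None:
        # Matches the ModuleNotFoundError in the traceback above: the wheel
        # did not land in this interpreter's site-packages.
        return f"{name}: NOT importable from {sys.executable}"
    return f"{name}: found at {spec.origin}"

print(check_module("llama_cpp"))
```

If this prints "NOT importable", the wheel (here a `musllinux_1_2_x86_64` build) was likely installed into a different environment, or its platform tag does not match the container's libc, despite the "successfully installed" log line.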

PS: Thanks for your awesome work for the community! <3

@benbender benbender added the bug Something isn't working label May 11, 2024
benbender (Author) commented:
Duplicate #140
