[LLAMA 502] llama command is not getting recognized #187
Comments
I have the same problem.

I've found that it silently fails if anything is wrong with your env. Try ensuring that you have installed Python 3.10, then create a Python 3.10 venv and build from source.

Create an environment with Python 3.10 and install it:
conda create -n stack python=3.10
pip install llama-stack
Refer to: https://github.com/meta-llama/llama-stack/blob/main/docs/getting_started.md

Thank you, I believe this is the correct solution.

I had the same issue. The only thing that worked for me was installing Python 3.10.11; I was previously on 3.9.

Using python=3.10 worked for me as well.
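Besides the conda route shown above, the same fix can be sketched with the standard-library venv module. This is a minimal sketch, assuming a `python3.10` interpreter is already installed and on PATH; the environment name `stack-venv` is arbitrary:

```shell
# Create and activate a Python 3.10 virtual environment
# (fails immediately if python3.10 is not installed):
python3.10 -m venv stack-venv
. stack-venv/bin/activate

# Inside the venv, pip installs the `llama` console script into
# stack-venv/bin, which activation has already put on PATH:
pip install llama-stack
llama model list --show-all
```

The key point either way is that `pip` and the `llama` entry point must belong to the same 3.10 environment; a user-level install against an older system Python will not produce a working `llama` command.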
I have run pip install llama-stack following the README, and it installed successfully, but:
"llama model list --show-all
llama: command not found"
The llama command is not being recognized; I need to resolve this to proceed further.
//*****************************
Linux terminal#:
/llama-stack$ pip install llama-stack
Requirement already satisfied: llama-stack in /home/seema1/.local/lib/python3.8/site-packages (0.0.1a5)
Requirement already satisfied: httpx<1,>=0.23.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (0.27.2)
Requirement already satisfied: pydantic<3,>=1.9.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (2.9.2)
Requirement already satisfied: distro<2,>=1.7.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.9.0)
Requirement already satisfied: sniffio in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.3.1)
Requirement already satisfied: anyio<5,>=3.5.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.5.2)
Requirement already satisfied: typing-extensions<5,>=4.7 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.12.2)
Requirement already satisfied: idna in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2.8)
Requirement already satisfied: httpcore==1.* in /home/seema1/.local/lib/python3.8/site-packages (from httpx<1,>=0.23.0->llama-stack) (1.0.6)
Requirement already satisfied: certifi in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2019.11.28)
Requirement already satisfied: pydantic-core==2.23.4 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (2.23.4)
Requirement already satisfied: annotated-types>=0.6.0 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (0.7.0)
Requirement already satisfied: exceptiongroup>=1.0.2; python_version < "3.11" in /home/seema1/.local/lib/python3.8/site-packages (from anyio<5,>=3.5.0->llama-stack) (1.2.2)
Requirement already satisfied: h11<0.15,>=0.13 in /home/seema1/.local/lib/python3.8/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->llama-stack) (0.14.0)
Linux terminal#:
/llama-stack$ llama model list --show-all
llama: command not found
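Two things stand out in the transcript above: the packages resolve into `~/.local/lib/python3.8/site-packages` (a user-level install against Python 3.8, older than the 3.10 this thread recommends), and user-level console scripts land in `~/.local/bin`, which is not always on PATH. A hedged diagnosis sketch, using only standard shell commands:

```shell
# Confirm which interpreter pip is bound to (the output above implies 3.8):
python3 --version

# User-level installs place console scripts such as `llama` in ~/.local/bin.
# Put that directory on PATH for the current session
# (append the export line to ~/.bashrc to persist it):
export PATH="$HOME/.local/bin:$PATH"

# Verify the directory is now on PATH:
case ":$PATH:" in
  *":$HOME/.local/bin:"*) echo "~/.local/bin is on PATH" ;;
  *)                      echo "~/.local/bin is NOT on PATH" ;;
esac
```

Even with PATH fixed, the version mismatch remains: per the comments above, the reliable fix is reinstalling llama-stack under Python 3.10.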