
BUG: ModuleNotFoundError: No module named 'mistral_inference.transformer' #202

Open
yafangwang9 opened this issue Jul 23, 2024 · 6 comments
Labels
bug Something isn't working

Comments

@yafangwang9
Python -VV

from mistral_inference.transformer import Transformer
from mistral_inference.generate import generate

from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
from mistral_common.protocol.instruct.messages import UserMessage
from mistral_common.protocol.instruct.request import ChatCompletionRequest

tokenizer = MistralTokenizer.from_file(f"{mistral_models_path}/tekken.json")
model = Transformer.from_folder(mistral_models_path)

prompt = "How expensive would it be to ask a window cleaner to clean all windows in Paris. Make a reasonable guess in US Dollar."

completion_request = ChatCompletionRequest(messages=[UserMessage(content=prompt)])

tokens = tokenizer.encode_chat_completion(completion_request).tokens

out_tokens, _ = generate([tokens], model, max_tokens=64, temperature=0.35, eos_id=tokenizer.instruct_tokenizer.tokenizer.eos_id)
result = tokenizer.decode(out_tokens[0])

print(result)

Pip Freeze

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Input In [3], in <cell line: 1>()
----> 1 from mistral_inference.transformer import Transformer
      2 from mistral_inference.generate import generate
      4 from mistral_common.tokens.tokenizers.mistral import MistralTokenizer

ModuleNotFoundError: No module named 'mistral_inference.transformer'

Reproduction Steps

I used mistral-inference with Mistral-Nemo and got this issue.

Expected Behavior

https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407

Additional Context

No response

Suggested Solutions

No response

@yafangwang9 yafangwang9 added the bug Something isn't working label Jul 23, 2024
@bhargavyagnik
Contributor

Did you build mistral-inference from source, or install it with pip? Can you run pip freeze and share the details?

@yafangwang9
Author

yafangwang9 commented Jul 24, 2024 via email

@bhargavyagnik
Contributor

You can try installing mistral-inference again in this cloud environment:
run pip install mistral-inference there.
Also check which -a pip to see which pip you are actually using — the package might be getting installed into a different environment.
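To narrow this down from inside the notebook itself, a small diagnostic sketch like the following (a hypothetical helper, not part of mistral-inference) shows which interpreter the kernel runs and where, if anywhere, the package resolves:

```python
import importlib.util
import sys


def locate(module_name: str) -> str:
    """Report where a module resolves for this interpreter, or why it doesn't."""
    try:
        spec = importlib.util.find_spec(module_name)
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. mistral_inference) is missing.
        return f"{module_name}: parent package not installed"
    if spec is None:
        return f"{module_name}: not found (perhaps an older mistral-inference layout)"
    return f"{module_name}: {spec.origin}"


print(sys.executable)  # the interpreter the notebook kernel actually uses
print(locate("mistral_inference"))
print(locate("mistral_inference.transformer"))
```

If sys.executable does not match the environment where pip installed the package, installing with that exact interpreter (python -m pip install mistral-inference) usually resolves the mismatch.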

@yafangwang9
Author

yafangwang9 commented Jul 24, 2024 via email

@ShadyPi

ShadyPi commented Jul 24, 2024

I hit the same issue and found that poetry is required to properly install mistral-inference. I verified this on 3 different servers: if you install poetry first and then pip install mistral-inference, it works perfectly.

@shulin16

Since the mistral_inference/ package was moved under the src/ folder, you just need to append that path to the Python path:

import sys
sys.path.append('$YOUR_REPO_DIR/src/')
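A slightly fuller version of that workaround, using a hypothetical clone location (REPO_DIR is an assumption — point it at wherever you checked out the mistral-inference repo):

```python
import sys
from pathlib import Path

# Hypothetical clone location; adjust to your actual checkout of mistral-inference.
REPO_DIR = Path.home() / "mistral-inference"

# The repo uses a src/ layout, so the importable package lives in src/.
sys.path.append(str(REPO_DIR / "src"))

# With src/ on sys.path, the import from the README should now resolve:
# from mistral_inference.transformer import Transformer
```

Note this only helps if you are running from a source checkout; a pip-installed package should not need any sys.path changes.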
