
Error on interactive run #14

Open
sreekarchigurupati opened this issue Sep 28, 2023 · 3 comments

@sreekarchigurupati

Running the code as follows:

python -m main interactive /path/mistral-7B-v0.1/

gives the following error:

Prompt: Traceback (most recent call last):
  File "/N/soft/sles15/deeplearning/Python-3.10.9/Lib/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/N/soft/sles15/deeplearning/Python-3.10.9/Lib/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/N/project/grg_data/projects/LLMs/mistral/mistral-src/main.py", line 134, in <module>
    fire.Fire({
  File "/N/u/srchig/BigRed200/.local/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/N/u/srchig/BigRed200/.local/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/N/u/srchig/BigRed200/.local/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/N/project/grg_data/projects/LLMs/mistral/mistral-src/main.py", line 110, in interactive
    res, _logprobs = generate([prompt], transformer, tokenizer, max_tokens)
  File "/N/u/srchig/BigRed200/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
TypeError: generate() takes 3 positional arguments but 4 were given
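The mismatch can be reproduced in isolation. The sketch below uses a stand-in generate() (hypothetical, not the real mistral-src function) whose max_tokens parameter is keyword-only, mirroring the repository's current signature: passing it as a fourth positional argument raises exactly this TypeError.

```python
# Stand-in for mistral-src's generate(): three positional parameters,
# with max_tokens and temperature keyword-only (hypothetical sketch).
def generate(prompts, model, tokenizer, *, max_tokens=64, temperature=0.7):
    return ["..."] * len(prompts), None

# Old-style positional call, as in the installed main.py -> TypeError
try:
    generate(["Prompt"], "transformer", "tokenizer", 35)
except TypeError as e:
    print(e)  # generate() takes 3 positional arguments but 4 were given

# Keyword call, as in the current main.py -> works
res, _logprobs = generate(["Prompt"], "transformer", "tokenizer", max_tokens=35)
```

The keyword-only marker `*` in the definition is what turns the fourth positional argument into an error rather than silently binding it.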

Victoire21 commented Oct 11, 2023

Hi, @arthurmensch

I'm doing a master's degree and I'm looking for a French-speaking AI model to integrate into my project. Mistral AI could be the ideal candidate, so I used Google Colab to learn more about it.

I successfully completed each step until I ran this code: !python -m main interactive /content/mistral-7B-v0.1/
The prompt-generation process stopped suddenly, and I couldn't work out how to debug it.


So if someone can give me a hand to help me go further, it would be extremely appreciated. Thanks!

@Victoire21

@sreekarchigurupati:

As far as I know, there are two files involved: tokenizer.model and param.json, found inside the mistral-7B-v0.1 folder.
In your main.py, line 110, generate([prompt], transformer, tokenizer, max_tokens) passes max_tokens as a fourth positional argument.
This is the actual main.py line 110 code:

    prompt = input("Prompt: ")
    res, _logprobs = generate(
        [prompt],
        transformer,
        tokenizer,
        max_tokens=max_tokens,
        temperature=temperature,
    )

It passes max_tokens and temperature as keyword arguments, so generate() only receives three positional arguments. Try downloading the updated repository.
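Before re-downloading, you can also check which signature your local checkout actually has. A minimal sketch using inspect.signature on a stand-in generate() (hypothetical; in practice, inspect the one imported from your main.py):

```python
import inspect

# Stand-in with the current keyword-only signature (hypothetical defaults).
def generate(prompts, model, tokenizer, *, max_tokens=64, temperature=0.7):
    ...

sig = inspect.signature(generate)
print(sig)
# If max_tokens appears after the '*' in the printed signature, it is
# keyword-only and must not be passed positionally.
```

If your installed copy prints a signature where max_tokens comes before any `*`, you are running the older code and updating the checkout should resolve the TypeError.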

@Victoire21

Hi @arthurmensch, @alexandresablayrolles,

I tried to debug using pdb. I noticed that the Python directory being searched is /usr/local/lib/python3.10/.
My question is: is it possible to test Mistral AI on Colab, or do I have to use a Linux environment? If so, could this error be caused by Python features that are not supported there? Thank you for your reply.
Have a good day!
