
Error found when using hf api #9

Open
Jiopro opened this issue May 20, 2024 · 1 comment
Jiopro commented May 20, 2024

When I tried to call:

llm = NanoLLM.from_pretrained(
   model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
   api='hf',                              
   api_token='mytoken',               
   quantization='q4f16_ft',           
)

I got:

Traceback (most recent call last):
  File "/root/nanollm.py", line 6, in <module>
    llm = NanoLLM.from_pretrained(
  File "/opt/NanoLLM/nano_llm/nano_llm.py", line 74, in from_pretrained
    model = HFModel(model_path, **kwargs)
  File "/opt/NanoLLM/nano_llm/models/hf.py", line 19, in __init__
    super(HFModel, self).__init__(**kwargs)
TypeError: NanoLLM.__init__() missing 1 required positional argument: 'model_path'

Here is the original code from nano_llm/models/hf.py:

class HFModel(NanoLLM):
    """
    Huggingface Transformers model
    """
    def __init__(self, model_path, load=True, init_empty_weights=False, **kwargs):
        """
        Initializer
        """
        super(HFModel, self).__init__(**kwargs)

The issue is that the model_path argument is never forwarded to the superclass __init__ method: HFModel.__init__ accepts it but then calls super().__init__ with only **kwargs, so NanoLLM.__init__ is missing its required positional argument. The fix is to pass model_path through to NanoLLM during initialization. Here is the corrected version of the class definition:

class HFModel(NanoLLM):
    """
    Huggingface Transformers model
    """
    def __init__(self, model_path, load=True, init_empty_weights=False, **kwargs):
        """
        Initializer
        """
        super(HFModel, self).__init__(model_path, **kwargs)
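
The failure mode can be reproduced in isolation. A minimal sketch below uses hypothetical Base/BrokenChild/FixedChild classes (not actual NanoLLM code) to show how swallowing a required positional argument in a subclass __init__ triggers the same TypeError, and how forwarding it fixes the problem:

```python
class Base:
    def __init__(self, model_path, **kwargs):
        self.model_path = model_path

class BrokenChild(Base):
    def __init__(self, model_path, **kwargs):
        # Bug: model_path is accepted here but never reaches Base.__init__
        super().__init__(**kwargs)

class FixedChild(Base):
    def __init__(self, model_path, **kwargs):
        # Fix: forward the positional argument explicitly
        super().__init__(model_path, **kwargs)

try:
    BrokenChild("TinyLlama/TinyLlama-1.1B-Chat-v1.0")
except TypeError as e:
    # TypeError: Base.__init__() missing 1 required positional argument: 'model_path'
    print(e)

m = FixedChild("TinyLlama/TinyLlama-1.1B-Chat-v1.0")
print(m.model_path)
```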

Hope this helps!

dusty-nv (Owner) commented May 30, 2024

Thanks @Jiopro , --api=hf is working again with 9a0ba67 and will be in the 24.6 release of NanoLLM 👍
