This repository has been archived by the owner on May 12, 2023. It is now read-only.

Can't construct neither GPT4All nor GPT4All_J (models are loaded) #105

Open
wiinnie-the-pooh opened this issue May 8, 2023 · 1 comment

Comments


wiinnie-the-pooh commented May 8, 2023

Running on Windows, using Miniconda. Here is the corresponding stack trace:

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ C:\miniconda3\envs\nomic\Lib\site-packages\transformers\configuration_utils.py:658 in            │
│ _get_config_dict                                                                                 │
│                                                                                                  │
│   655 │   │                                                                                      │
│   656 │   │   try:                                                                               │
│   657 │   │   │   # Load config dict                                                             │
│ ❱ 658 │   │   │   config_dict = cls._dict_from_json_file(resolved_config_file)                   │
│   659 │   │   │   config_dict["_commit_hash"] = commit_hash                                      │
│   660 │   │   except (json.JSONDecodeError, UnicodeDecodeError):                                 │
│   661 │   │   │   raise EnvironmentError(                                                        │
│                                                                                                  │
│ C:\miniconda3\envs\nomic\Lib\site-packages\transformers\configuration_utils.py:745 in            │
│ _dict_from_json_file                                                                             │
│                                                                                                  │
│   742 │   @classmethod                                                                           │
│   743 │   def _dict_from_json_file(cls, json_file: Union[str, os.PathLike]):                     │
│   744 │   │   with open(json_file, "r", encoding="utf-8") as reader:                             │
│ ❱ 745 │   │   │   text = reader.read()                                                           │
│   746 │   │   return json.loads(text)                                                            │
│   747 │                                                                                          │
│   748 │   def __eq__(self, other):                                                               │
│ in decode:322                                                                                    │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe0 in position 4: invalid continuation byte

During handling of the above exception, another exception occurred:

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ C:\Users\apo\source\repos\nomic\main.py:11 in <module>                                           │
│                                                                                                  │
│    8                                                                                             │
│    9 from nomic.gpt4all import GPT4AllGPU as GPT4                                                │
│   10 # from pygpt4all import GPT4All as GPT4                                                     │
│ ❱ 11 m = GPT4(LLAMA_PATH)                                                                        │
│   12 config = {'num_beams': 2,                                                                   │
│   13 │   │     'min_new_tokens': 10,                                                             │
│   14 │   │     'max_length': 100,                                                                │
│                                                                                                  │
│ C:\Users\apo\source\repos\nomic\nomic\gpt4all\gpt4all.py:26 in __init__                          │
│                                                                                                  │
│    23 │   │   self.model_path = llama_path                                                       │
│    24 │   │   self.tokenizer_path = llama_path                                                   │
│    25 │   │   self.lora_path = 'nomic-ai/vicuna-lora-multi-turn_epoch_2'                         │
│ ❱  26 │   │   self.model = AutoModelForCausalLM.from_pretrained(self.model_path,                 │
│    27 │   │   │   │   │   │   │   │   │   │   │   │   │   │     device_map="auto",               │
│    28 │   │   │   │   │   │   │   │   │   │   │   │   │   │     torch_dtype=torch.float16)       │
│    29 │   │   self.tokenizer = AutoTokenizer.from_pretrained(self.tokenizer_path)                │
│                                                                                                  │
│ C:\miniconda3\envs\nomic\Lib\site-packages\transformers\models\auto\auto_factory.py:441 in       │
│ from_pretrained                                                                                  │
│                                                                                                  │
│   438 │   │   │   if kwargs_copy.get("torch_dtype", None) == "auto":                             │
│   439 │   │   │   │   _ = kwargs_copy.pop("torch_dtype")                                         │
│   440 │   │   │                                                                                  │
│ ❱ 441 │   │   │   config, kwargs = AutoConfig.from_pretrained(                                   │
│   442 │   │   │   │   pretrained_model_name_or_path,                                             │
│   443 │   │   │   │   return_unused_kwargs=True,                                                 │
│   444 │   │   │   │   trust_remote_code=trust_remote_code,                                       │
│                                                                                                  │
│ C:\miniconda3\envs\nomic\Lib\site-packages\transformers\models\auto\configuration_auto.py:916 in │
│ from_pretrained                                                                                  │
│                                                                                                  │
│   913 │   │   kwargs["_from_auto"] = True                                                        │
│   914 │   │   kwargs["name_or_path"] = pretrained_model_name_or_path                             │
│   915 │   │   trust_remote_code = kwargs.pop("trust_remote_code", False)                         │
│ ❱ 916 │   │   config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_n   │
│   917 │   │   if "auto_map" in config_dict and "AutoConfig" in config_dict["auto_map"]:          │
│   918 │   │   │   if not trust_remote_code:                                                      │
│   919 │   │   │   │   raise ValueError(                                                          │
│                                                                                                  │
│ C:\miniconda3\envs\nomic\Lib\site-packages\transformers\configuration_utils.py:573 in            │
│ get_config_dict                                                                                  │
│                                                                                                  │
│   570 │   │   """                                                                                │
│   571 │   │   original_kwargs = copy.deepcopy(kwargs)                                            │
│   572 │   │   # Get config dict associated with the base config file                             │
│ ❱ 573 │   │   config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwar   │
│   574 │   │   if "_commit_hash" in config_dict:                                                  │
│   575 │   │   │   original_kwargs["_commit_hash"] = config_dict["_commit_hash"]                  │
│   576                                                                                            │
│                                                                                                  │
│ C:\miniconda3\envs\nomic\Lib\site-packages\transformers\configuration_utils.py:661 in            │
│ _get_config_dict                                                                                 │
│                                                                                                  │
│   658 │   │   │   config_dict = cls._dict_from_json_file(resolved_config_file)                   │
│   659 │   │   │   config_dict["_commit_hash"] = commit_hash                                      │
│   660 │   │   except (json.JSONDecodeError, UnicodeDecodeError):                                 │
│ ❱ 661 │   │   │   raise EnvironmentError(                                                        │
│   662 │   │   │   │   f"It looks like the config file at '{resolved_config_file}' is not a val   │
│   663 │   │   │   )                                                                              │
│   664                                                                                            │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
OSError: It looks like the config file at 'C:\Users\apo\Downloads\ggml-gpt4all-j.bin' is not a valid JSON file.
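
For context, the failure itself is independent of nomic/gpt4all: the transformers Auto* loaders expect a Hugging Face checkpoint (a directory or hub repo containing a config.json plus weight files), while ggml-gpt4all-j.bin is a single GGML binary. AutoConfig therefore tries to parse the binary bytes as JSON, which produces the UnicodeDecodeError and then the OSError above. A minimal sketch of that failing step, reusing the file path from the traceback:

from transformers import AutoConfig

# The GGML binary from the traceback above; any single non-JSON .bin behaves the same.
GGML_BIN = r"C:\Users\apo\Downloads\ggml-gpt4all-j.bin"

try:
    # AutoModelForCausalLM.from_pretrained calls into AutoConfig, which looks for a
    # config.json and ends up reading the binary file as if it were JSON text.
    AutoConfig.from_pretrained(GGML_BIN)
except OSError as err:
    print(err)  # "It looks like the config file at '...' is not a valid JSON file."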
@wiinnie-the-pooh changed the title from "Can't consrut the" to "Can't construct neither GPT4All nor GPT4All_J (models are loaded)" on May 8, 2023

g0dEngineer commented May 9, 2023

You are probably using the wrong calls for mismatched models: the loader class has to match the model file you downloaded.

So for the groovy v1.3 bin, try:

from pygpt4all import GPT4All_J

model = GPT4All_J('./ggml-gpt4all-j-v1.3-groovy.bin')  # path to the .bin, here relative to where your Python script runs

Or, for the converted bin, try:

from pygpt4all.models.gpt4all import GPT4All

AI_MODEL = GPT4All('./gpt4all-converted.bin')  # path to the converted .bin, again relative to your script

https://github.com/nomic-ai/pygpt4all
Both of the above loaded successfully for me on Windows.
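
For completeness, a minimal generation sketch for the groovy model once it constructs. This assumes the generate(prompt, n_predict=..., new_text_callback=...) interface shown in the pygpt4all README linked above (the exact signature may differ between versions), and the relative path is a placeholder for wherever the .bin was saved:

from pygpt4all import GPT4All_J

def new_text_callback(text):
    # Stream each generated token to stdout as it arrives.
    print(text, end="", flush=True)

model = GPT4All_J('./ggml-gpt4all-j-v1.3-groovy.bin')
model.generate("Once upon a time, ", n_predict=55, new_text_callback=new_text_callback)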
