This repository has been archived by the owner on Nov 3, 2023. It is now read-only.
Replies: 1 comment
- Your error is not with the BART model import; it is an issue with your install. What version of PyTorch are you using?
- I tried to run BlenderBot 2.0 on Colab with `parlai interactive -mf zoo:blenderbot2/blenderbot2_400M/model --rag_retriever_type observation_echo_retriever --knowledge_access_method memory_only` (I don't want internet search), but it failed. I always get `ModuleNotFoundError: No module named 'parlai.zoo.bart.bart_large'`.
I followed the suggestion in the BlenderBot 2.0 README and installed fairseq (I tried both installing via pip and from source), but I still get the same error. I checked the source code, and there is indeed no module called `bart.bart_large` under `parlai.zoo`. There is only a module named `bart`, not `bart_large`, under the zoo module, although the `download` function in that `bart` module does download the weights of the BART-large model.
When the code tries to load `parlai.zoo.bart.bart_large` at the line `my_module = importlib.import_module(module_name)` in `build_data.modelzoo_path`, I wonder whether it should instead have passed `parlai.zoo.bart.bart` to `importlib.import_module`. Is this naming inconsistency the root cause of the error?
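To make the suspected mismatch concrete, here is a minimal sketch of the mapping I believe is happening (the helper `zoo_module_name` is my own reconstruction, not ParlAI's actual implementation): a zoo path like `zoo:bart/bart_large/model` turns into a dotted module name that `importlib.import_module` then fails to find, because only `parlai/zoo/bart/bart.py` exists on disk.

```python
import importlib

def zoo_module_name(zoo_path: str) -> str:
    # Hypothetical reconstruction: map "zoo:bart/bart_large/model"
    # to the dotted module path "parlai.zoo.bart.bart_large".
    animal, release = zoo_path[len("zoo:"):].split("/")[:2]
    return f"parlai.zoo.{animal}.{release}"

name = zoo_module_name("zoo:bart/bart_large/model")
print(name)  # parlai.zoo.bart.bart_large

try:
    importlib.import_module(name)
except ModuleNotFoundError as err:
    # Raised when no module by that dotted name is importable,
    # e.g. when only parlai/zoo/bart/bart.py exists.
    print(err)
```

If the derivation above is roughly what `modelzoo_path` does, then passing `parlai.zoo.bart.bart` instead would match the module that actually exists.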
The full stack trace follows:
2023-02-24 04:28:57.740849: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 AVX512F AVX512_VNNI FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-02-24 04:28:57.905752: I tensorflow/core/util/port.cc:104] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable TF_ENABLE_ONEDNN_OPTS=0.
2023-02-24 04:28:58.617934: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/lib64-nvidia
2023-02-24 04:28:58.618050: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/lib64-nvidia
2023-02-24 04:28:58.618071: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.
04:29:09 | building data: /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/blenderbot2_400M/model.tgz
04:29:09 | Downloading http://parl.ai/downloads/_models/blenderbot2/blenderbot2_400M/model.tgz to /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/blenderbot2_400M/model.tgz
Downloading model.tgz: 100% 2.42G/2.42G [00:40<00:00, 59.3MB/s]
04:30:37 | Overriding opt["model_file"] to /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/blenderbot2_400M/model (previously: /checkpoint/kshuster/projects/knowledge_bot/kbot_memfix_sweep25_Fri_Jul__9/338/model.oss)
04:30:37 | Using CUDA
04:30:37 | loading dictionary from /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/blenderbot2_400M/model.dict
04:30:37 | num words = 50264
04:30:37 | Downloading https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe to /usr/local/lib/python3.8/dist-packages/data/gpt2/vocab.bpe
Downloading vocab.bpe: 0.00B [00:00, ?B/s]
04:30:38 | Downloading https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json to /usr/local/lib/python3.8/dist-packages/data/gpt2/encoder.json
Downloading encoder.json: 0.00B [00:00, ?B/s]
04:30:39 | BlenderBot2Fid: full interactive mode on.
Downloading (…)okenizer_config.json: 100% 28.0/28.0 [00:00<00:00, 4.46kB/s]
Downloading (…)solve/main/vocab.txt: 100% 232k/232k [00:00<00:00, 2.00MB/s]
Downloading (…)/main/tokenizer.json: 100% 466k/466k [00:00<00:00, 3.16MB/s]
Downloading (…)lve/main/config.json: 100% 570/570 [00:00<00:00, 218kB/s]
04:30:42 | building data: /usr/local/lib/python3.8/dist-packages/data/models/hallucination/bart_rag_token/model.tgz
04:30:42 | Downloading http://parl.ai/downloads/_models/hallucination/bart_rag_token/model.tgz to /usr/local/lib/python3.8/dist-packages/data/models/hallucination/bart_rag_token/model.tgz
Downloading model.tgz: 100% 950M/950M [00:16<00:00, 59.0MB/s]
04:31:17 | Creating the search engine retriever.
04:31:17 | No protocol provided, using "http://"
04:31:25 | building data: /usr/local/lib/python3.8/dist-packages/data/models/hallucination/multiset_dpr/hf_bert_base.cp
04:31:25 | Downloading https://dl.fbaipublicfiles.com/dpr/checkpoint/retriver/multiset/hf_bert_base.cp to /usr/local/lib/python3.8/dist-packages/data/models/hallucination/multiset_dpr/hf_bert_base.cp
Downloading hf_bert_base.cp: 100% 876M/876M [00:14<00:00, 60.1MB/s]
Downloading (…)pytorch_model.bin: 100% 440M/440M [00:06<00:00, 68.7MB/s]
Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.seq_relationship.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.bias', 'cls.seq_relationship.weight', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.bias']
04:31:50 | building data: /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/query_generator/model.tgz
04:31:50 | Downloading http://parl.ai/downloads/_models/blenderbot2/query_generator/model.tgz to /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/query_generator/model.tgz
Downloading model.tgz: 100% 750M/750M [00:13<00:00, 57.1MB/s]
04:32:15 | Building Query Generator from file: /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/query_generator/model
04:32:22 | building data: /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/memory_decoder/model.tgz
04:32:22 | Downloading http://parl.ai/downloads/_models/blenderbot2/memory_decoder/model.tgz to /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/memory_decoder/model.tgz
Downloading model.tgz: 100% 750M/750M [00:12<00:00, 58.3MB/s]
04:32:47 | Building Memory Decoder from file: /usr/local/lib/python3.8/dist-packages/data/models/blenderbot2/memory_decoder/model
[downloading BART models: /usr/local/lib/python3.8/dist-packages/data/models/bart]
Downloading bart.large.tar.gz: 100% 3.70G/3.70G [01:22<00:00, 44.9MB/s]
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/parlai/core/build_data.py", line 511, in modelzoo_path
my_module = importlib.import_module(module_name)
File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'parlai.zoo.bart.bart_large'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/bin/parlai", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.8/dist-packages/parlai/main.py", line 14, in main
superscript_main()
File "/usr/local/lib/python3.8/dist-packages/parlai/core/script.py", line 325, in superscript_main
return SCRIPT_REGISTRY[cmd].klass._run_from_parser_and_opt(opt, parser)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/script.py", line 108, in _run_from_parser_and_opt
return script.run()
File "/usr/local/lib/python3.8/dist-packages/parlai/scripts/interactive.py", line 118, in run
return interactive(self.opt)
File "/usr/local/lib/python3.8/dist-packages/parlai/scripts/interactive.py", line 84, in interactive
agent = create_agent(opt, requireModelExists=True)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/agents.py", line 468, in create_agent
model = create_agent_from_opt_file(opt)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/agents.py", line 421, in create_agent_from_opt_file
return model_class(opt_from_file)
File "/usr/local/lib/python3.8/dist-packages/parlai/agents/rag/rag.py", line 186, in __init__
self._generation_agent.__init__(self, opt, shared)  # type: ignore
File "/usr/local/lib/python3.8/dist-packages/parlai/agents/bart/bart.py", line 72, in __init__
super().__init__(opt, shared)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/torch_generator_agent.py", line 501, in __init__
self.model = fsdp_utils.fsdp_wrap(self.build_model())
File "/usr/local/lib/python3.8/dist-packages/projects/blenderbot2/agents/blenderbot2.py", line 969, in build_model
model = BlenderBot2FidModel(self.opt, self.dict)
File "/usr/local/lib/python3.8/dist-packages/projects/blenderbot2/agents/modules.py", line 840, in __init__
super().__init__(
File "/usr/local/lib/python3.8/dist-packages/projects/blenderbot2/agents/modules.py", line 98, in __init__
self.memory_decoder = MemoryDecoder(opt)
File "/usr/local/lib/python3.8/dist-packages/projects/blenderbot2/agents/sub_modules.py", line 296, in __init__
base_agent = create_agent_from_model_file(
File "/usr/local/lib/python3.8/dist-packages/parlai/core/agents.py", line 347, in create_agent_from_model_file
return create_agent_from_opt_file(opt)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/agents.py", line 421, in create_agent_from_opt_file
return model_class(opt_from_file)
File "/usr/local/lib/python3.8/dist-packages/parlai/agents/bart/bart.py", line 71, in __init__
opt = self._initialize_bart(opt)
File "/usr/local/lib/python3.8/dist-packages/parlai/agents/bart/bart.py", line 97, in _initialize_bart
compare_init_model_opts(opt, opt)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/agents.py", line 273, in compare_init_model_opts
opt['init_model'] = modelzoo_path(opt['datapath'], opt['init_model'])
File "/usr/local/lib/python3.8/dist-packages/parlai/core/build_data.py", line 519, in modelzoo_path
my_module.download(datapath)
File "/usr/local/lib/python3.8/dist-packages/parlai/zoo/bart/build.py", line 69, in download
ConversionScript.main(**args)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/script.py", line 127, in main
return cls._run_kwargs(kwargs)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/script.py", line 92, in _run_kwargs
return cls._run_from_parser_and_opt(opt, parser)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/script.py", line 108, in _run_from_parser_and_opt
return script.run()
File "/usr/local/lib/python3.8/dist-packages/parlai/agents/bart/convert_fairseq_to_parlai.py", line 140, in run
self.print_agent_act()
File "/usr/local/lib/python3.8/dist-packages/parlai/agents/bart/convert_fairseq_to_parlai.py", line 490, in print_agent_act
print(self.agent.act())
File "/usr/local/lib/python3.8/dist-packages/parlai/core/torch_agent.py", line 2148, in act
response = self.batch_act([self.observation])[0]
File "/usr/local/lib/python3.8/dist-packages/parlai/core/torch_agent.py", line 2244, in batch_act
output = self.eval_step(batch)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/torch_generator_agent.py", line 901, in eval_step
beam_preds_scores, beams = self._generate(
File "/usr/local/lib/python3.8/dist-packages/parlai/core/torch_generator_agent.py", line 1223, in _generate
b.advance(score[i], _ts)
File "/usr/local/lib/python3.8/dist-packages/parlai/core/torch_generator_agent.py", line 1599, in advance
self.partial_hyps[path_selection.hypothesis_ids.long()],
RuntimeError: indices should be either on cpu or on the same device as the indexed tensor (cpu)
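Incidentally, the final RuntimeError seems reproducible outside ParlAI: in PyTorch, indexing a CUDA tensor with CPU-resident indices raises exactly this error. A minimal sketch (the tensor names are illustrative stand-ins for `partial_hyps` and `hypothesis_ids`; the CUDA branch only runs when a GPU is available):

```python
import torch

scores = torch.arange(6).reshape(2, 3)  # stand-in for partial_hyps
hyp_ids = torch.tensor([1, 0])          # stand-in for hypothesis_ids

# Both tensors on CPU: indexing works fine.
print(scores[hyp_ids])  # rows [3, 4, 5] and [0, 1, 2]

if torch.cuda.is_available():
    gpu_scores = scores.cuda()
    try:
        gpu_scores[gpu_scores.new_empty(0)] if False else gpu_scores[hyp_ids]  # CPU indices into a CUDA tensor
    except RuntimeError as err:
        print(err)  # "indices should be either on cpu or on the same device ..."
    # The usual fix: move the indices to the indexed tensor's device.
    print(gpu_scores[hyp_ids.to(gpu_scores.device)])
```

So the conversion script presumably needs the hypothesis indices moved to the same device as the indexed tensor, if that is indeed what is happening here.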
Thank you very much again!