What does this issue mean? #227

Open
emilyhanyf opened this issue Jun 8, 2024 · 0 comments

Comments

@emilyhanyf

TypeError: BARTDecoder.prepare_inputs_for_inference() got an unexpected keyword argument 'cache_position'
-> Cannot close object, library is destroyed. This may cause a memory leak!
-> Cannot close object, library is destroyed. This may cause a memory leak!
-> Cannot close object, library is destroyed. This may cause a memory leak!
OCRing with base model failed on /lfs/skampere1/0/naveenkc/contest_scraper/test_nougat/731991.pdf... trying small model
/lfs/skampere1/0/naveenkc/miniconda/lib/python3.12/site-packages/torch/functional.py:512: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at ../aten/src/ATen/native/TensorShape.cpp:3587.)
  return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
  0%|          | 0/288 [00:24<?, ?it/s]
Traceback (most recent call last):
  File "/lfs/skampere1/0/naveenkc/miniconda/bin/nougat", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/lfs/skampere1/0/naveenkc/miniconda/lib/python3.12/site-packages/predict.py", line 167, in main
    model_output = model.inference(
                   ^^^^^^^^^^^^^^^^
  File "/lfs/skampere1/0/naveenkc/miniconda/lib/python3.12/site-packages/nougat/model.py", line 592, in inference
    decoder_output = self.decoder.model.generate(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/lfs/skampere1/0/naveenkc/miniconda/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/lfs/skampere1/0/naveenkc/miniconda/lib/python3.12/site-packages/transformers/generation/utils.py", line 1758, in generate
    result = self._sample(
             ^^^^^^^^^^^^^
  File "/lfs/skampere1/0/naveenkc/miniconda/lib/python3.12/site-packages/transformers/generation/utils.py", line 2394, in _sample
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: BARTDecoder.prepare_inputs_for_inference() got an unexpected keyword argument 'cache_position'
-> Cannot close object, library is destroyed. This may cause a memory leak!
-> Cannot close object, library is destroyed. This may cause a memory leak!
-> Cannot close object, library is destroyed. This may cause a memory leak!
-> Cannot close object, library is destroyed. This may cause a memory leak!
-> Cannot close object, library is destroyed. This may cause a memory leak!
OCRing with small model failed on /lfs/skampere1/0/naveenkc/contest_scraper/test_nougat/731991.pdf
The following files failed OCRing and need manual review:
/lfs/skampere1/0/naveenkc/contest_scraper/test_nougat/731991.pdf
Time taken: 65.70022702217102 seconds, 1.0950037837028503 minutes, 0.018250063061714172 hours
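
A hedged reading, not a confirmed fix: the last frame shows transformers' generate() forwarding a cache_position keyword argument into prepare_inputs_for_generation, which nougat re-points at BARTDecoder.prepare_inputs_for_inference; that method predates the keyword and rejects it. The error therefore most likely means the installed transformers release is newer than the one this nougat version was written against. Pinning transformers back to an older release usually avoids it; alternatively, the sketch below wraps the override so that keyword arguments it does not declare are dropped. The loading call and the attribute path model.decoder.model.prepare_inputs_for_generation are read off the traceback above and are assumptions, not verified against the repository.

```python
# Hedged workaround sketch, not an official fix: drop keyword arguments (such as
# `cache_position`) that newer transformers releases forward to
# prepare_inputs_for_generation but that nougat's
# BARTDecoder.prepare_inputs_for_inference does not accept.
# Attribute names below follow the traceback and are assumptions.
import inspect

from nougat import NougatModel
from nougat.utils.checkpoint import get_checkpoint

model = NougatModel.from_pretrained(get_checkpoint())  # assumed default checkpoint

original = model.decoder.prepare_inputs_for_inference
accepted = set(inspect.signature(original).parameters)  # kwargs the old method declares

def prepare_inputs_dropping_unknown(input_ids, **kwargs):
    # Keep only keyword arguments that prepare_inputs_for_inference knows about.
    filtered = {k: v for k, v in kwargs.items() if k in accepted}
    return original(input_ids, **filtered)

# generate() looks this attribute up on the decoder's inner model, where nougat
# normally installs prepare_inputs_for_inference; install the filtering wrapper instead.
model.decoder.model.prepare_inputs_for_generation = prepare_inputs_dropping_unknown
```

Filtering against the original signature rather than hard-coding cache_position keeps the wrapper working if later transformers releases add further keywords; if generation still fails afterwards, downgrading transformers to the version the nougat release was tested with is the safer route.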
