Add size to lru_cache #1
Also getting this error (on Colab).
@JosephGatto I also got that error on Colab only.
@gsasikiran I was not able to replicate this locally, but we could definitely set the maxsize to ensure this error does not occur again. If possible, could you open a PR with the maxsize fix you mentioned? I can merge it right away.
@AnjanaRita I set 'maxsize' to 'None', since the model is large and I am not sure how much cache size is required. If you would prefer to push the same, I will.
Also got this error, but then I changed the Python version to 3.8 and the problem no longer occurred. But I got a different error/problem after searching for topics. It seems that
@AnjanaRita You should give my username push access. Or you can add the parentheses and push yourself, along with
I assume that you have to provide the maxsize parameter to lru_cache. It worked for me when I provided the parameter.
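A minimal sketch of the fix being discussed, assuming the cached function is something like a model loader (`load_model` is a hypothetical stand-in, not the project's actual function). On Python versions before 3.8, `@lru_cache` must be applied with parentheses; the bare `@lru_cache` form raises a `TypeError`, which would explain the Colab-only failure:

```python
from functools import lru_cache

# Passing maxsize=None gives an unbounded cache, matching the
# suggestion in this thread. This call form also works on Python < 3.8,
# where a bare @lru_cache (no parentheses) raises a TypeError.
@lru_cache(maxsize=None)
def load_model(name):
    # Hypothetical placeholder for the expensive load being cached.
    return f"model:{name}"

# Repeated calls with the same argument hit the cache and return
# the same object instead of recomputing.
assert load_model("bert") is load_model("bert")
```

On Python 3.8+ a bare `@lru_cache` is also accepted (it defaults to `maxsize=128`), which is consistent with the report above that upgrading to 3.8 made the error disappear.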