Unsupported Torch Dynamo Operation #48
UPDATE: for now, working around this by avoiding compiling.
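For reference, a minimal sketch of what "avoiding compiling" can look like in practice: either skip torch.compile entirely, or exclude the step Dynamo cannot trace. The generate_one_token body below is a stand-in for illustration, not this repo's actual implementation.

```python
# Sketch of the workaround: leave the problematic step in eager mode by
# disabling Dynamo for it, even if the rest of the model is compiled.
# The function body is a placeholder, not the repo's real code.
import torch
import torch._dynamo

@torch._dynamo.disable  # run this step eagerly; Dynamo will not trace it
def generate_one_token(logits: torch.Tensor) -> torch.Tensor:
    return torch.argmax(logits, dim=-1)

print(generate_one_token(torch.randn(1, 32000)))
```

Alternatively, simply call the model without wrapping it in torch.compile at all.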
User:
Set torch._dynamo.config.verbose=True or TORCHDYNAMO_VERBOSE=1 for more information
You can suppress this exception and fall back to eager by setting:
"""
The above exception was the direct cause of the following exception:
Traceback (most recent call last):

Still getting this issue; I have already changed the llama class for the generate_one_token function.
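For completeness, here is a sketch of the two knobs the error message points at; the token_step function is only an illustrative placeholder, not code from this issue.

```python
# Sketch of the fallback suggested by the error message: verbose Dynamo logging
# plus suppress_errors, so unsupported ops fall back to eager instead of raising.
import torch
import torch._dynamo

torch._dynamo.config.verbose = True          # print details about the failing op
torch._dynamo.config.suppress_errors = True  # fall back to eager on Dynamo errors

def token_step(x: torch.Tensor) -> torch.Tensor:
    return torch.softmax(x, dim=-1)

compiled_step = torch.compile(token_step)
print(compiled_step(torch.randn(1, 8)))
```

Note that suppress_errors only papers over the failure: the region that hits the unsupported operation runs eagerly rather than being compiled.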
Steps to Reproduce:
NOTE: it is necessary to set max_len_seq = 512, otherwise an index-range error occurs.
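A hedged illustration of the note above; where this parameter is actually set depends on the repro script, and the helper named here is hypothetical.

```python
# Assumed repro tweak (the actual script and argument name may differ):
# cap the sequence length at 512 to avoid the index-range error noted above.
max_len_seq = 512

# e.g. passed into the generation setup of the repro script:
# generator = build_generator(max_len_seq=max_len_seq)  # hypothetical helper
```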
Error Message:
My current suspicions about the cause of the issue:
Help from those with deeper knowledge of the codebase would be appreciated: @alanwaketan, @ManfeiBai, @miladm