Gradient checkpointing / rematerialization #817
Answered by avik-pal
jakubMitura14 asked this question in Q&A
-
Hello, is there a way to do gradient checkpointing, so that instead of saving the values of the forward pass, the forward function is recomputed during the backward pass to reduce the memory consumption of a Lux layer?
Answered by avik-pal on Aug 3, 2024
Replies: 1 comment
-
For Zygote you can use https://fluxml.ai/Zygote.jl/stable/utils/#Zygote.checkpointed. I am not sure whether Enzyme has a built-in equivalent (@wsmoses might know?), though mimicking the Zygote one for Enzyme shouldn't be hard.
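To illustrate the suggestion above, here is a minimal sketch of wrapping a Lux model's forward pass in `Zygote.checkpointed`, so intermediate activations are recomputed during the backward pass instead of being stored. The model shape, sizes, and loss are hypothetical, chosen only for the example:

```julia
using Lux, Zygote, Random

# Hypothetical small model; sizes are illustrative.
model = Chain(Dense(4 => 16, relu), Dense(16 => 4))
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 4, 8)

# Zygote.checkpointed(f, args...) calls f(args...) but does not store
# f's intermediate values for the pullback; instead it reruns f during
# the backward pass, trading extra compute for lower memory use.
loss(ps) = sum(abs2, Zygote.checkpointed(p -> first(model(x, p, st)), ps))

grads = Zygote.gradient(loss, ps)
```

For a deep `Chain`, you would typically checkpoint each segment (or each layer) separately, so only segment boundaries are kept alive across the backward pass.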
0 replies
Answer selected by
jakubMitura14