Jep did not free GPU memory when it was closed #540
Comments
Jep does not allocate or free any GPU memory, and I do not think this is something we can do. Jep will free unused Python objects when they are no longer referenced, but things like modules and native extensions may not be freed. You may be able to free additional memory by manually clearing …
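As a rough illustration of the "manually clearing" idea above, the sketch below removes cached modules from `sys.modules` and runs the garbage collector. The `clear_modules` helper and the `prefix` parameter are hypothetical names, not a Jep API, and native-extension memory (such as GPU allocations held by a framework's C/C++ layer) may still not be released:

```python
import gc
import sys

def clear_modules(prefix):
    """Drop cached modules whose names start with `prefix` so their
    Python-level objects become collectable (hypothetical helper; memory
    held by native extensions may still not be freed)."""
    removed = [name for name in list(sys.modules) if name.startswith(prefix)]
    for name in removed:
        del sys.modules[name]
    gc.collect()
    return removed
```

For example, `clear_modules("tensorflow")` would evict the cached `tensorflow` modules before closing the interpreter.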
Is it possible to start a new Process (via multiprocessing) from a SharedInterpreter? Running TensorFlow in a Process in pure Python did free the GPU memory, but I could not make Process work in Jep; it throws a pickle error.
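For reference, the pure-Python pattern described above looks roughly like the sketch below: the model runs in a child process, so all of its memory (including any GPU allocations made in the child) is returned to the OS when the child exits. `run_model` is a placeholder for the actual TensorFlow workload; under Jep the `Process` target must be picklable and importable by the child, which is where the reported pickle error arises:

```python
import multiprocessing as mp

def run_model(result_queue):
    # Placeholder for the real workload: load and run the model here.
    # When this process exits, the OS reclaims everything it allocated,
    # including GPU memory claimed by libraries loaded in this process.
    result_queue.put("done")

if __name__ == "__main__":
    ctx = mp.get_context("spawn")  # spawn starts a fresh interpreter in the child
    queue = ctx.Queue()
    worker = ctx.Process(target=run_model, args=(queue,))
    worker.start()
    print(queue.get())
    worker.join()
```

With the `spawn` start method the child pickles the target function by reference, which requires it to be importable from the child process; code defined only inside an embedded interpreter may fail that lookup.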
Multiprocessing has worked in the past, but I do not think it works well with recent Python versions. There may be ways to get it working, but that is not something anyone has had time to investigate and document; see #228 for more detailed background.
We found a problem where Python objects with reference cycles were not cleaned up when an interpreter was closed. An additional garbage collection pass was added in #556, which is included in the 4.2.1 release. It's possible this change helps free GPU memory if the memory is held by a Python object in a reference cycle. Do you see any change in behavior with Jep 4.2.1?
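To illustrate why the extra collection pass matters: objects in a reference cycle are never freed by reference counting alone and need the cycle collector. A minimal sketch (plain CPython, not Jep-specific; `Node` is an illustrative class):

```python
import gc
import weakref

class Node:
    def __init__(self):
        self.ref = None

# Build a reference cycle: a -> b -> a.
a, b = Node(), Node()
a.ref, b.ref = b, a
watcher = weakref.ref(a)   # lets us observe whether `a` was freed

del a, b                   # refcounts never reach zero because of the cycle
assert watcher() is not None  # still alive: reference counting cannot free it
gc.collect()               # the cycle collector detects and frees the cycle
assert watcher() is None
```

This is the kind of garbage that an extra `gc.collect()` at interpreter close can reclaim when ordinary refcounting cannot.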
Describe the bug
I am using Jep to run a TensorFlow model on a GPU machine. I expected the GPU memory to be released after the Jep interpreter was closed, but it was not. When running a single Python script with the python executable, the GPU memory is released when the Python process finishes; here, however, the Java process is the one retaining the GPU allocation. How can I release this GPU memory when Jep is closed but the JVM is still running?
Environment (please complete the following information):