fix memory leak by calling H5close() in NDFileHDF5::closeFile() #390
Conversation
The last change avoids printing the error message if the object count is 1 at the time/place of the check. The assumption is that the file object is closed in the next lines. After this change there is no leak, as expected.
```
==21315== 4,000 bytes in 1,000 blocks are definitely lost in loss record 6,503 of 6,903
==21315==    at 0x4C31B25: calloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==21315==    by 0x510CE7E: NDFileHDF5::createAttributeDataset(NDArray*) (NDFileHDF5.cpp:2652)
==21315==    by 0x510E7C9: NDFileHDF5::openFile(char const*, int, NDArray*) (NDFileHDF5.cpp:282)
==21315==    by 0x50FDEF5: NDPluginFile::openFileBase(int, NDArray*) (NDPluginFile.cpp:73)
..
```
Resolves #385. Calling H5close() eliminates the memory leak. It is not entirely clear whether this is a leak internal to the HDF5 library or something we are not closing correctly.
Exactly. It could be that we're not closing some HDF5 objects - but the object count at the end is just 1 (the file), so it's not clear where that memory is leaked...
Any thoughts on the performance impact this call to H5close() makes?
I was worried that calling H5close() might have a huge impact, but I perceive no change looking at the RunTime field.
I've added the following lines to NDFileHDF5.cpp, before the call to
I get these lines for each iteration of the 1000 images generated/saved:
As @ulrikpedersen already suspected:
I think this may have been answered.
Why is this needed? After all, H5close() is considered to 'close the complete HDF5 library and free the resources', supposed to be used at app exit according to the docs (or for explicit library termination). Looking at the docs for H5Fclose():
We must be hitting this case, and this might make …
I don't think that's the case: we actually do set the "close degree" to "strong", which means all objects are forced closed when the file is closed. See #385 (comment) for links to relevant code & docs. It still leaves a bit of a mystery as to why we are leaking memory in HDF5 (and need to …).
Just a reminder that this should be considered a temporary workaround, even though it causes no noticeable performance penalty. It assumes that there is only one NDFileHDF5 plugin in the IOC. In the case of two plugins, or just another usage of the HDF5 library, one of them will fail and eventually crash the IOC.
I think the right way to track this problem down is to create a stand-alone program that duplicates what the NDFileHDF5 plugin is doing. We can see if it has a memory leak. If it does, then we can contact the HDF5 Group to see if there is an issue with their library.
I agree with the idea - but in practice it is quite difficult to recreate this in a simple application, as the plugin does a lot... I can start by sending them an email to explain the issue and ask if they have any clues as to what could leak - or what we could do to debug further.
Here is a new pull request, #397, that removes the introduced H5close() and adds H5Xclose() calls on specific objects that were identified as memory leaks.
This is a change to fix a memory leak in the HDF5 plugin; see #385.
Warning! It appears that calling H5close() after H5Fclose() plugs the hole, but the core source of the leak might still be at large!