FW: Heap Fragmentation in H5Zdeflate?
From: Hdf-forum [mailto:[hidden email]] On Behalf Of Kochunas, Brendan
Sent: Thursday, January 11, 2018 11:14 AM
To: [hidden email]
Subject: [Hdf-forum] Heap Fragmentation in H5Zdeflate?
I have a detailed question about a problem I am encountering using the gzip filter in HDF5.
The problem manifests as a failure to allocate the output buffer in H5Zdeflate (line 135), which reports unavailable resources. I have debugged this quite a bit and confirmed that the buffer being allocated is only a few MB, and that roughly 60 GB of memory is still available on the machine at the time the allocation fails. I strongly suspect heap fragmentation is the cause (I don't have a 1 MB contiguous block of memory available).
This only happens on one particular machine. I understand there is a lot of important subtlety to chunk size and chunk cache size, so I'm wondering whether anyone has encountered similar behavior, i.e. heap fragmentation occurring in H5Zdeflate, and whether anyone can offer guidance on resolving it. I was thinking the next thing to try would be to experiment with the chunk cache size options (H5Pset_cache) described here