Our programs crash because they use too much memory. This is with HDF5 1.8.9.
In most cases we have no problem, but with large files the following sometimes happens:
In this example, reading some values from a 325 MB HDF5 file (the "timesteps" of the data points, 400,001 double-precision values in total) drives memory usage up to 2 GB. The culprit appears to be our call to H5Dread. Any ideas what we're doing wrong here?
The method causing the problem is as follows:
std::vector<double> Hdf5DataReader::GetUnlimitedDimensionValues()
{
    // Define hyperslab in the dataset
    hid_t time_dataspace = H5Dget_space(mUnlimitedDatasetId);

    // Get the dataset/dataspace dimensions
    hsize_t num_timesteps;
    H5Sget_simple_extent_dims(time_dataspace, &num_timesteps, NULL);

    // Data buffer to return
    std::cout << "Number of timesteps we are reserving memory for = "
              << num_timesteps << "\n";
    std::vector<double> ret(num_timesteps);

    PrintMemoryUsage("made memory space");

    // Read data from hyperslab in the file into the hyperslab in memory
    H5Dread(mUnlimitedDatasetId, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL,
            H5P_DEFAULT, &ret[0]);

    PrintMemoryUsage("read into memory space");

    H5Sclose(time_dataspace);
    return ret;
}
and the output is:
Number of timesteps we are reserving memory for = 400001
made memory space: memory use = 43.5898 MB.
read into memory space: memory use = 2182.4 MB.
(We use the following code to determine the amount of memory allocated to the program - does this look reasonable?
#include <unistd.h>
#include <sys/resource.h>
#include <iostream>
#include <string>

void PrintMemoryUsage(const std::string& rPrefix)
{
    struct rusage rusage;
    getrusage(RUSAGE_SELF, &rusage);
    double max_res = (double)(rusage.ru_maxrss)/(1024); // Convert KB to MB
    std::cout << rPrefix << ": memory use = " << max_res << " MB.\n";
}
)