I would like to know whether HDF5 is suitable for real-time data logging.
More precisely: I am working on a project in which we continuously acquire (at sampling frequencies in the range of 30 to 400 Hz) quite a lot of data over several hours, of various kinds (telemetry, signals, video).
The data should be written to disk in real time (or with a slight delay), so that we do not lose it in case of a failure.
Our first prototype is based on sqlite3; however, we suspect some of its limitations will become more severe with prolonged use: speed, the one-database == one-file constraint, and difficulties accessing the database from multiple threads (locking exceptions when reading and writing at the same time).
So I am considering using HDF5 as a back-end for storing data on disk (and NumPy / PyTables for the in-memory representation). Do you think it is possible to regularly update an HDF5 file from these Python bindings?
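
For reference, here is a minimal sketch of the kind of append-and-flush loop I have in mind with PyTables (the file name, channel count, and random data source are just placeholders for our real acquisition code):

```python
import numpy as np
import tables

# Open (or create) the HDF5 file and an extendable array for incoming samples.
h5file = tables.open_file("telemetry.h5", mode="w", title="Acquisition log")
atom = tables.Float64Atom()
# EArray with an extendable first dimension; 8 channels per sample is an assumption.
samples = h5file.create_earray(h5file.root, "samples", atom, shape=(0, 8),
                               title="Telemetry samples")

# Acquisition loop (placeholder data source): append a block of rows,
# then flush so the data reaches disk and would survive a crash.
for _ in range(100):
    block = np.random.rand(40, 8)   # e.g. ~0.1 s of data at 400 Hz
    samples.append(block)
    h5file.flush()                  # commit appended rows to disk

h5file.close()
```

Would repeatedly appending and flushing like this at 30 to 400 Hz be a reasonable use of HDF5, or is it likely to run into problems over several hours?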