Perhaps the best solution is simply to use a DEFLATE library and run it on large blocks of data with a high compression setting.
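For example, a minimal sketch using Python's built-in `zlib` module, which implements DEFLATE; the 64 KiB block size and compression level 9 are illustrative choices, not requirements:

```python
import zlib

BLOCK_SIZE = 64 * 1024  # illustrative block size, not a requirement

def compress_blocks(data: bytes) -> list[bytes]:
    """Compress large blocks independently at the highest level (9)."""
    out = []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        out.append(zlib.compress(block, 9))
    return out
```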
If you want to roll your own stream compression algorithm, you can apply the same trick that works for audio files: send the first value as-is, then encode each subsequent sample as the difference from the previous one (delta coding).
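A minimal sketch of delta coding, assuming integer samples:

```python
def delta_encode(samples: list[int]) -> list[int]:
    # Send the first value as-is; every later value becomes a difference.
    if not samples:
        return []
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def delta_decode(deltas: list[int]) -> list[int]:
    # Inverse transform: a running sum restores the original samples.
    out: list[int] = []
    total = 0
    for i, d in enumerate(deltas):
        total = d if i == 0 else total + d
        out.append(total)
    return out
```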
The best encoding for the deltas then depends on how quickly the data changes:
If the data changes quickly, use an adaptive Huffman tree. If the differences are uncorrelated (data plus noise), this gets you to within about one bit per sample of the entropy.
If several consecutive samples can be equal to each other (the data does not change very quickly and there is no high-frequency noise), encode each nonzero difference with one Huffman tree and the lengths of the zero runs with a second Huffman tree; this keeps you within about two bits per run of the entropy (see the run-splitting sketch below).
You can even encode each difference in a single bit (up or down), but then you still need a way to encode runs of zeros.
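To make the two-tree scheme concrete, here is a minimal sketch of the run-splitting step, assuming integer deltas; `split_runs` is a hypothetical helper name, and the Huffman coders themselves are omitted:

```python
def split_runs(deltas: list[int]) -> tuple[list[int], list[int]]:
    """Split a delta stream into (nonzero values, zero-run lengths).

    Each nonzero delta is preceded by the count of zeros before it,
    so the two lists can be fed to two separate Huffman coders.
    """
    values, zero_runs = [], []
    run = 0
    for d in deltas:
        if d == 0:
            run += 1
        else:
            zero_runs.append(run)
            values.append(d)
            run = 0
    zero_runs.append(run)  # trailing run of zeros, if any
    return values, zero_runs
```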
My suggestion: delta-encode once or twice until the samples are uncorrelated, then run the result through the DEFLATE library.
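Putting it together, a minimal sketch of that pipeline, assuming the samples fit in 32-bit signed integers; the fixed-width packing is an illustrative choice (a variable-length format would likely compress better):

```python
import struct
import zlib

def delta_once(samples: list[int]) -> list[int]:
    # First value as-is, then each sample minus its predecessor.
    return samples[:1] + [b - a for a, b in zip(samples, samples[1:])]

def compress_samples(samples: list[int], passes: int = 2) -> bytes:
    """Delta-encode once or twice, then DEFLATE the packed bytes."""
    for _ in range(passes):
        samples = delta_once(samples)
    # Pack as little-endian 32-bit signed ints before compressing.
    raw = struct.pack(f"<{len(samples)}i", *samples)
    return zlib.compress(raw, 9)
```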