I am looking for an algorithm that lets me represent an incoming sequence of bits as letters ('a'..'z') in a minimal way, such that the bit stream can be regenerated from the letters, without keeping the entire sequence in memory.
That is, given an external source of bits (each read returns a practically random bit) and a number of bits of user input, I would like to print the minimum number of characters that can represent those bits.
Ideally this would be parameterizable: how much working memory is used versus the maximum number of bits wasted.
The efficiency goal is the same number of characters as representing the bit sequence as a number in base 26.
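To make the efficiency goal concrete, here is a quick calculation (my own illustration, not from the question): each letter of a 26-symbol alphabet can carry at most log2(26) bits of information, which bounds how few letters any scheme can use.

```python
import math

# Each base-26 letter carries log2(26) bits of information.
bits_per_letter = math.log2(26)
print(f"{bits_per_letter:.4f}")  # about 4.7004 bits per letter

# So n random bits need at least ceil(n / log2(26)) letters.
# Example for n = 128 bits:
n = 128
print(math.ceil(n / bits_per_letter))  # 28 letters
```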
Solutions:
If enough storage were available: store the entire sequence, then use big-integer arithmetic with repeated MOD 26 operations.
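The whole-sequence approach can be sketched in Python, which has arbitrary-precision integers built in. This is only an illustration of the idea, not the questioner's code; the leading sentinel bit is my addition, used so that leading zero bits and the exact length survive the round trip.

```python
def bits_to_letters(bits):
    # Treat the whole bit sequence as one big integer.
    # A leading 1 sentinel preserves leading zeros and the exact length.
    n = 1
    for b in bits:
        n = (n << 1) | b
    # Repeated divmod by 26 converts the integer to base-26 letters.
    letters = []
    while n > 0:
        n, r = divmod(n, 26)
        letters.append(chr(ord('a') + r))
    return ''.join(reversed(letters))

def letters_to_bits(s):
    # Rebuild the big integer from base-26 digits.
    n = 0
    for ch in s:
        n = n * 26 + (ord(ch) - ord('a'))
    # Peel bits off until only the sentinel remains.
    bits = []
    while n > 1:
        bits.append(n & 1)
        n >>= 1
    return bits[::-1]
```

This achieves the base-26 optimum but requires holding the entire sequence, which is exactly the constraint the question wants to avoid.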
Convert every 9 bits into 2 characters. This seems suboptimal, wasting roughly a quarter of a bit of information per output letter.
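The 9-bits-to-2-letters scheme can be sketched as follows (my illustration, assuming for simplicity that the input length is a multiple of 9; a real implementation would also have to encode the length of a final partial chunk). It works because two letters can hold 26 * 26 = 676 values, enough for the 512 values of a 9-bit group:

```python
def encode9(bits):
    # Pack each group of 9 bits (value 0..511) into two base-26 letters.
    out = []
    for i in range(0, len(bits), 9):
        v = 0
        for b in bits[i:i + 9]:
            v = (v << 1) | b
        out.append(chr(ord('a') + v // 26))
        out.append(chr(ord('a') + v % 26))
    return ''.join(out)

def decode9(s):
    # Unpack each letter pair back into 9 bits, high bit first.
    bits = []
    for i in range(0, len(s), 2):
        v = (ord(s[i]) - ord('a')) * 26 + (ord(s[i + 1]) - ord('a'))
        bits.extend((v >> k) & 1 for k in range(8, -1, -1))
    return bits
```

The waste comes from using only 512 of the 676 codes a letter pair offers: two letters could carry log2(676) ≈ 9.4 bits but carry only 9, i.e. about 0.2 bits lost per letter.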