I wrote a small cryptographic module in Python whose task is to encrypt a file and put the result in a tarfile. The source file may be very large, but that should not be a problem, because my program only needs to work with a small block of data at a time, encrypting it on the fly and writing it out as it goes.
I am looking for a way to avoid doing this in two passes: first writing all the data to a temporary file, then adding the result to the tarfile.
Basically I do the following (where generator_encryptor is a simple generator that yields chunks of data read from the source file):
import tarfile

t = tarfile.open("target.tar", "w")
tmp = open('content', 'wb')
for chunk in generator_encryptor("sourcefile"):
    tmp.write(chunk)
tmp.close()
t.add('content')
t.close()
I'm a little annoyed that I have to use a temporary file at all, since it ought to be easy to write blocks directly into the tar file. Collecting all the chunks into one string and using something like t.addfile('content', StringIO(bigcipheredstring)) seems ruled out, because I cannot guarantee that I have enough memory to hold the whole bigcipheredstring.
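For reference, here is a minimal sketch of the in-memory variant I am ruling out (the function name add_bytes_to_tar and the sample data are just illustrative). It works, but only because the TarInfo header needs the total size up front, which forces the entire ciphertext to sit in memory first:

```python
import io
import tarfile

def add_bytes_to_tar(tar_path, member_name, data):
    # Build the tar member header by hand; tarfile needs the
    # final size before any payload bytes are written, which is
    # exactly why the whole ciphertext must already be in memory.
    info = tarfile.TarInfo(name=member_name)
    info.size = len(data)
    with tarfile.open(tar_path, "w") as t:
        t.addfile(info, io.BytesIO(data))

add_bytes_to_tar("target.tar", "content", b"bigcipheredstring goes here")
```

With a multi-gigabyte source file that single bytes object is not an option, hence the question.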
Any hint on how to do this?