I have a Python script that checks the pickup directory and processes all found files and then deletes them.
How can I make sure I don't take a file that is still being written by the process that drops files into this directory?
My test case is pretty simple: I copy 300 MB of files into the pickup directory, and the script often grabs a file that is still being written. It processes only the partial file and then deletes it, which in turn causes an OS-level error in the writing process, since the file it was writing to has disappeared.
I tried acquiring a file lock (using the FileLock module) before opening, processing, and deleting each file, but it did not help.
I considered checking the file's modification time and skipping anything modified within the last X seconds, but that seems awkward.
My test is on OSX, but I'm trying to find a solution that will work on major platforms.
I see a similar question here (How to check if a file is saved?), but there was no clear solution.
Thanks.