Sharing files between multiple processes [Perl]

I have an application in which a single CSV file is updated at random times by several processes. If two processes try to update it at the same time (e.g. both append a line), I suspect some data will be lost or overwritten. Is that right?

What is the best way to avoid this?

thanks,

+4
2 answers

Use Perl DBI with the DBD::CSV driver to access your data; it will take care of flock for you. (Unless you are on Windows 95 or the old Mac OS, where flock is not available.) And if you later decide to upgrade to an RDBMS, you will be well prepared.
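A minimal sketch of this approach; the file name `data.csv` and the columns `id` and `name` are hypothetical, chosen for illustration:

```perl
use strict;
use warnings;
use DBI;

# Connect to the directory holding the CSV files; with f_ext set,
# the table "data" maps to the file "data.csv".
my $dbh = DBI->connect('dbi:CSV:', undef, undef, {
    f_dir      => '.',
    f_ext      => '.csv/r',
    RaiseError => 1,
});

# Create the table (i.e. the CSV file with a header line) on first use.
$dbh->do('CREATE TABLE data (id INTEGER, name CHAR (64))')
    unless -e 'data.csv';

# Each statement locks the file for its duration, so concurrent
# writers do not clobber each other's rows.
$dbh->do('INSERT INTO data (id, name) VALUES (?, ?)', undef, 1, 'alice');

$dbh->disconnect;
```

Because the SQL layer stays the same, switching to a real RDBMS later is mostly a matter of changing the DSN in the `connect` call.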

The simple flock approach suggested by @Fluff should also work fine, of course.
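For example, a minimal sketch of appending one row under an exclusive flock (the file name `data.csv` and the row contents are hypothetical):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Open for append and take an exclusive lock; flock blocks until
# any other process holding the lock releases it.
open my $fh, '>>', 'data.csv' or die "Cannot open data.csv: $!";
flock($fh, LOCK_EX)           or die "Cannot lock data.csv: $!";

# Re-seek to end of file: another writer may have appended lines
# between our open() and the moment we acquired the lock.
seek($fh, 0, 2);

print {$fh} "1,alice\n";

close $fh;   # closing the handle releases the lock
```

The lock is advisory, so this only works if every process that touches the file uses the same flock discipline.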

+6

If you want a simple, manual way to take care of file locking:

1) As soon as a process opens the CSV, it creates a lock. (The lock can take the form of creating a dummy file; the process must delete that file as soon as it is done reading/updating the CSV.)
2) Have each process check for the lock file before trying to update the CSV. (If the dummy file is present, another process is accessing the CSV; otherwise it may update it.)
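A minimal sketch of the steps above, assuming hypothetical file names `data.csv` and `data.csv.lock`. Note that a naive "check, then create" is racy: two processes can both see no lock file and both proceed. Opening the lock file with `O_CREAT | O_EXCL` makes the check and the creation a single atomic step:

```perl
use strict;
use warnings;
use Fcntl qw(O_CREAT O_EXCL O_WRONLY);

my $lock = 'data.csv.lock';

# sysopen with O_EXCL fails if the file already exists, so only one
# process can create the lock file at a time.
my $lfh;
until (sysopen $lfh, $lock, O_CREAT | O_EXCL | O_WRONLY) {
    sleep 1;   # another process holds the lock; wait and retry
}
close $lfh;

# ... read/update data.csv here ...

unlink $lock or warn "Cannot remove lock file: $!";
```

One drawback of this scheme is that a crashed process leaves a stale lock file behind; a flock-based lock, released automatically when the process exits, avoids that problem.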
0

Source: https://habr.com/ru/post/1386818/
