Tracking children from old script runs

I have a Perl script that spawns several children. They all take quite a while to run, creating directories and files along the way. I often notice something I want to change before the children die of natural causes, so I have to shut everything down (which involves a few grep and kill calls) and delete the files the children created. It is not a lot of work, but it is a pain in the neck. I would like a setup in which all children are tracked, so that when I start the parent again and children from an old run are still alive, they are reported.

My best idea so far is to keep a log file of the children I started, with their identifiers, and to check and update it at the beginning of the parent script. If any old children are still running, the parent would report them and die, so the user can clean up directories and files manually before a new run.

So my question is this: how can I add children to the log file? Is there a way to set up a hook that takes care of this automatically, or am I stuck remembering to do it everywhere in the code where a new process is started?
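A minimal sketch of the log-file idea, assuming a small fork wrapper; the names here (children.pid, spawn_child) are made up for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical PID log; the file name is an assumption for this sketch.
    my $pid_log = 'children.pid';

    # At startup, report children from a previous run that are still alive.
    if (-e $pid_log) {
        open my $fh, '<', $pid_log or die "Cannot read $pid_log: $!";
        chomp(my @pids = <$fh>);
        close $fh;
        my @alive = grep { kill 0, $_ } @pids;    # signal 0 = existence check
        if (@alive) {
            die "Children from a previous run are still running: @alive\n"
              . "Kill them and clean up their files, then remove $pid_log.\n";
        }
        unlink $pid_log;                          # stale log, no survivors
    }

    # Wrapper around fork() so every new child is logged automatically.
    sub spawn_child {
        my ($work) = @_;
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {                          # child
            $work->();
            exit 0;
        }
        open my $fh, '>>', $pid_log or die "Cannot append to $pid_log: $!";
        print {$fh} "$pid\n";
        close $fh;
        return $pid;                              # parent
    }

    # Example: a child that just sleeps, standing in for the real work.
    my $child = spawn_child(sub { sleep 60 });
    print "Started child $child\n";

Because fork is wrapped in one place, every new child gets logged without having to remember it at each call site; on a clean finish the parent would remove the log so the next run starts fresh.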

PS: Of course, I am open to suggestions on better ways to achieve this!

1 answer

What about signal handlers? The parent can keep track of the PIDs of the jobs it started (and, if you like, reap them with a SIGCHLD handler). When you want to terminate prematurely, have the parent kill all of its children and clean up after them. If the child processes are also Perl scripts, you can put signal handlers in the children and have them clean up after themselves.
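A minimal sketch of that approach, assuming both parent and children are plain Perl; the %children hash and the choice of TERM as the shutdown signal are illustrative, not prescribed by the answer:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX ':sys_wait_h';      # WNOHANG for non-blocking waitpid

    my %children;                 # pid => 1 for every child still running

    # Reap children as they exit so %children holds only live PIDs.
    $SIG{CHLD} = sub {
        while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
            delete $children{$pid};
        }
    };

    # On Ctrl-C or TERM, kill the remaining children before exiting.
    for my $sig (qw(INT TERM)) {
        $SIG{$sig} = sub {
            kill 'TERM', keys %children if %children;
            waitpid($_, 0) for keys %children;    # collect them
            exit 1;
        };
    }

    # Spawn a few children standing in for the real long-running jobs.
    for my $n (1 .. 3) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            # A real child could trap TERM here and delete its own files.
            $SIG{$_} = sub { exit 0 } for qw(INT TERM);
            sleep 600;
            exit 0;
        }
        $children{$pid} = 1;
    }

    sleep 1 while %children;      # parent idles until all children are gone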



