Perl and process management

I am working on a fairly large enterprise Perl application with dozens of modules, mostly used to crawl various resources on the Internet.

One of the routines I wrote performs image search and analysis. Processing each item usually takes a couple of seconds, so I hand each one off to a separate process (forking...). The problem is that after a while the system becomes very unstable and memory fills up.
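The fork-per-task pattern described above might look roughly like the sketch below (`analyze_image` is a hypothetical stand-in for the actual routine). One common cause of creeping resource use with this pattern is forgetting to reap children with `waitpid`, which leaves them as zombies:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical placeholder for the real image search/analysis routine.
sub analyze_image { my ($task) = @_; return; }

my @tasks = (1 .. 5);
my %children;

for my $task (@tasks) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        analyze_image($task);   # child does the slow work
        exit 0;                 # child must exit, not continue the loop
    }
    $children{$pid} = 1;        # parent remembers the child PID
}

# Reap every child; unreaped children linger as zombies and the
# process table grows over time.
while (%children) {
    my $pid = waitpid(-1, 0);
    delete $children{$pid} if $pid > 0;
}
```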

Questions

  • Is it because every forked process gets a copy of the parent's data in its own memory space? If so, does each child carry a copy of ALL the loaded modules? (and there are dozens...)
  • What is the best way to free this memory / manage these processes?
1 answer

Forking uses copy-on-write, so forked processes should not take up much extra memory as long as they are not particularly long-lived: memory pages stay shared with the parent until one side writes to them.
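Even with copy-on-write, many long-lived children running at once will still exhaust memory. A common way to bound this is to cap the number of concurrent children; a minimal sketch using the CPAN module Parallel::ForkManager (an assumption here, not something the question mentions):

```perl
use strict;
use warnings;
use Parallel::ForkManager;   # CPAN module; assumed to be installed

# Allow at most 4 children at a time so workers cannot pile up.
my $pm = Parallel::ForkManager->new(4);

for my $task (1 .. 20) {
    $pm->start and next;   # parent: spawn a child, move to next task
    # ... child: do the image search/analysis for $task here ...
    $pm->finish;           # child exits; the parent reaps it
}
$pm->wait_all_children;    # block until every child has finished
```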

It looks like you have a memory leak in your system. Do you have mutually recursive dependencies or circular data structures? If so, you may want to look at Scalar::Util::weaken, which turns a reference into a weak one so the reference counts of your data structures can drop to zero.
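To illustrate the point about circular structures: two objects that hold strong references to each other never reach a reference count of zero, so they leak even after the variables go out of scope. Weakening one side of the cycle fixes this (the `Node` class and `DESTROY` counter below are illustrative only):

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

our $destroyed = 0;
package Node { sub DESTROY { $main::destroyed++ } }  # count freed objects

{
    my $x = bless {}, 'Node';
    my $y = bless {}, 'Node';
    $x->{peer} = $y;        # strong reference: x -> y
    $y->{peer} = $x;        # strong reference: y -> x (a cycle)
    weaken($y->{peer});     # make the back-reference weak, breaking the cycle
}
# With weaken(), both nodes are destroyed when the scope ends;
# without it, neither would be, and the pair would leak.
```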


Source: https://habr.com/ru/post/1383302/
