I am working on a fairly large enterprise application written in Perl, with dozens of modules, which is mostly used to work with various things over the Internet.
One of the routines that I wrote performs image search and analysis. Processing each parameter usually takes a couple of seconds, so I hand each one off to another process (fork ...). The problem is that after a while the system becomes very unstable and memory fills up.
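For context, the pattern is roughly the following (a minimal sketch; `@params` and `analyze_image()` are hypothetical stand-ins for the real inputs and routine):

```perl
use strict;
use warnings;

# Hypothetical stand-ins for the real work items and the image routine.
my @params = @ARGV;
sub analyze_image { my ($p) = @_; sleep 2; }   # stub: the real image search/analysis

foreach my $param (@params) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: starts as a copy-on-write image of the parent,
        # including every module the parent has already loaded.
        analyze_image($param);
        exit 0;   # child must exit, or it falls through into the parent's loop
    }
    # Parent: each child must eventually be reaped, otherwise
    # zombies accumulate alongside the growing memory use.
}

1 while wait() != -1;   # crude blocking reap of all children
```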
Questions
- Is it because every forked process gets a copy of the parent's data in its own memory? If so, does each child have a copy of ALL the modules (and there are dozens of them ...)?
- What is the best way to free this memory / manage these processes?