What problem are you having, and what does "big" mean to you? I have friends who need to load 200 GB files into memory, so their idea of good advice is very different from that of someone shopping for the smallest VM slices and suffering along with 250 MB of RAM (really? My phone has more than that).
In general, Perl holds on to any memory you use, even when it is no longer using it. Also realize that optimizing in one direction, e.g. memory, can adversely affect another, such as speed.
This is not an exhaustive list (and there is more in Programming Perl):
• Use Perl's memory-profiling tools to help you find problem areas. See the questions on profiling heap memory usage in Perl programs and on how to find the amount of physical memory used by a hash in Perl.
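As a minimal sketch of that sort of measurement, the CPAN module Devel::Size (an assumption; it is not a core module) reports how much memory a data structure occupies:

```perl
use strict;
use warnings;
use Devel::Size qw(size total_size);   # CPAN module, not in core

my %config = ( hosts => [ 'a' .. 'z' ], retries => 3 );

# size() counts only the top-level structure; total_size()
# follows references and counts everything reachable.
printf "shallow: %d bytes\n", size( \%config );
printf "deep:    %d bytes\n", total_size( \%config );
```

The exact numbers vary by Perl version and platform, so treat them as relative measurements for finding the biggest offenders rather than absolute truth.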
• Use lexical variables with the smallest scope possible so Perl can reuse that memory when you no longer need it.
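A sketch of that scoping advice (process_chunk is a hypothetical subroutine standing in for your real work):

```perl
use strict;
use warnings;

{
    # The big buffer is only needed inside this block.
    my $big_buffer = 'x' x (10 * 1024 * 1024);   # ~10 MB
    process_chunk($big_buffer);
}   # $big_buffer goes out of scope here; Perl can reuse the space

sub process_chunk { my ($data) = @_; return length $data }
```

Note that Perl typically keeps the freed space for its own later use rather than returning it to the operating system, which is the "Perl holds on to any memory" behavior mentioned above.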
• Avoid creating big temporary structures. For example, reading a file with foreach reads all of the input at once. If you only need it line by line, use while.
    foreach ( <FILE> ) { ... }  # list context, all at once
    while  ( <FILE> ) { ... }  # scalar context, line by line
• You may not need the file in memory at all. Memory-map files instead of slurping them.
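One way to memory-map a file, assuming the CPAN module File::Map is installed (the filename here is made up):

```perl
use strict;
use warnings;
use File::Map qw(map_file);   # CPAN module, not in core

# Map the file instead of slurping it; the OS pages data in on
# demand, so the whole file is never copied into the process.
map_file my $map, 'big_file.txt';

# $map behaves like an ordinary (read-only) Perl string.
my $matches = () = $map =~ /pattern/g;
print "found $matches matches\n";
</pre>
```

This is especially useful when you only scan the file or touch a small part of it, since untouched pages never cost you physical memory.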
• If you need to create big data structures, consider something like DBM::Deep or another storage engine to keep most of the data out of RAM and on disk until you need it.
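A minimal sketch with DBM::Deep (a CPAN module; the filename and keys are made up). The structure lives in the file, and only the parts you touch come into RAM:

```perl
use strict;
use warnings;
use DBM::Deep;   # CPAN module, not in core

# $db acts like a hash reference, but it is backed by foo.db.
my $db = DBM::Deep->new( 'foo.db' );

$db->{huge_list} = [ 1 .. 1_000 ];       # written to disk
print $db->{huge_list}[42], "\n";        # only this element is read back
```

The trade-off is speed: every access goes through the file, so this is for data that is too big for RAM, not for hot inner loops.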
• Don't let people use your program. Whenever I've done that, I've reduced the memory footprint by about 100%. It also cuts down on support requests.
• Pass big chunks of text and big aggregates by reference so you don't make a copy and end up storing the same information twice. If you have to copy it because you want to change it, you may be stuck. This goes both ways: as subroutine arguments and as subroutine return values:
    call_some_sub( \$big_text, \@long_array );

    sub call_some_sub {
        my( $text_ref, $array_ref ) = @_;
        ...
        return \%hash;
        }
• Track down memory leaks in modules. I had big problems with an application until I realized that a module wasn't releasing memory. I found a fix in the module's RT queue, applied it, and solved the problem.
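A common source of such leaks is circular references, which Perl's reference counting can never collect. A sketch of breaking the cycle with weaken from the core module Scalar::Util:

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

my $parent = { name => 'parent' };
my $child  = { name => 'child', parent => $parent };
$parent->{child} = $child;

# Without this, $parent and $child keep each other's reference
# counts above zero forever, so neither is ever freed.
weaken $child->{parent};
```

Tools such as Devel::Cycle can help you find these cycles in an existing data structure.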
• If you need to handle a big chunk of data once but don't want the persistent memory footprint, offload the work to a child process. The child process only has the memory footprint while it's working. When you get the answer back, the child process exits and releases its memory. Similarly, work-distribution systems such as Gearman can spread the work among machines.
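A sketch of the child-process approach using Perl's forking open (assumes a Unix-like system; the computation is a placeholder). The child builds the big structure, sends back a small answer through a pipe, and releases all its memory when it exits:

```perl
use strict;
use warnings;

# open with '-|' forks; the child's STDOUT is the parent's filehandle.
my $pid = open my $from_child, '-|';
die "fork failed: $!" unless defined $pid;

if ( $pid == 0 ) {                        # child
    my @huge = map { $_ * $_ } 1 .. 1_000_000;   # big, temporary
    print scalar @huge, "\n";             # send only the small result
    exit 0;
}                                         # child's memory is gone now

chomp( my $answer = <$from_child> );      # parent
close $from_child;
print "child computed $answer items\n";
```

The parent process never grows; all the transient memory lives and dies with the child.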
• Turn recursive solutions into iterative ones. Perl doesn't have tail-recursion optimization, so every new call adds to the call stack. You can optimize the tail-call problem yourself with goto tricks or a module, but that's a lot of work to hold on to a technique you probably don't need.
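A toy illustration of the conversion (summing 1..N, chosen only because it is small; real cases are messier). The recursive version grows the call stack with N; the iterative one uses constant stack depth:

```perl
use strict;
use warnings;

# Recursive: one stack frame per call, N frames deep.
sub sum_recursive {
    my ($n) = @_;
    return 0 if $n == 0;
    return $n + sum_recursive( $n - 1 );
}

# Iterative: a single frame and one accumulator.
sub sum_iterative {
    my ($n) = @_;
    my $total = 0;
    $total += $_ for 1 .. $n;
    return $total;
}
```

For deep structures (trees, graphs), the usual rewrite keeps an explicit stack or queue in a Perl array instead of recursing.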
• Did he use 6 GB, or only five? Well, to tell you the truth, in all this excitement I kind of lost track myself. But being as this is Perl, the most powerful language in the world, and would blow your memory clean away, you've got to ask yourself one question: "Do I feel lucky?" Well, do ya, punk?
There's more, but it's too early in the morning for me to figure out what it is. I cover some of this in Mastering Perl and Effective Perl Programming.