This question came up because I noticed that my script slows down and eventually shuts down while it writes content to a file in a while loop. First, I checked my memory and saw a pattern: this happens once the free memory on my operating system has dropped to about 2-3%. Then I looked at the block of code that consumes RAM, and I figured that as the script keeps appending content to the file, the file grows, which also seems to take up memory. Here is the code that creates the file:
open(my $fh, '>>', $filePath) or die "Cannot open $filePath: $!";
while (my $row = $sth->fetchrow_hashref) {
    print $fh join('|', $row->{name}, $row->{address}), "\n";
}
close $fh;
If I comment out the print statement and run the script, my RAM does not decrease. So I am fairly sure the memory is being taken up by the filehandle, or by something else Perl does behind the scenes when writing. This leads to my question: is there a way to write a file from a Perl script straight to disk, instead of going through RAM?
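For example, is something like the following what I need? This is only a sketch of the idea (using the same $filePath and $sth as above): sysopen/syswrite should bypass PerlIO's output buffering, so each record goes straight to the OS rather than sitting in a Perl-level buffer.

# Sketch only: unbuffered writes via sysopen/syswrite.
use Fcntl qw(O_WRONLY O_APPEND O_CREAT);

sysopen(my $raw, $filePath, O_WRONLY | O_APPEND | O_CREAT)
    or die "Cannot open $filePath: $!";

while (my $row = $sth->fetchrow_hashref) {
    my $line = join('|', $row->{name}, $row->{address}) . "\n";
    syswrite($raw, $line) or die "Write to $filePath failed: $!";
}
close $raw;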
- I tried flushing the filehandle after each line, and it still didn't help (see the sketch after this list for roughly what I did).
- Another thing, which is also very strange: after my script finishes, I look at my memory and the files still seem to occupy it. When I delete the files, the memory is freed. I use free on Linux to check my memory.
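Here is roughly the flushing attempt mentioned above, just a sketch assuming IO::Handle's autoflush/flush is the right way to do it:

use IO::Handle;

open(my $fh, '>>', $filePath) or die "Cannot open $filePath: $!";
$fh->autoflush(1);    # push every print out of Perl's buffer immediately

while (my $row = $sth->fetchrow_hashref) {
    print $fh join('|', $row->{name}, $row->{address}), "\n";
    # I also tried an explicit flush after each line:
    # $fh->flush;
}
close $fh;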
