Perl cannot allocate more than 1.1 GB on a Mac Snow Leopard server with 32 GB of RAM

I have a Mac server (Snow Leopard) with 32 GB of RAM. When I try to allocate more than 1.1 GB of RAM in Perl (v5.10.0), I get an out-of-memory error. Here is the script I used:

  #!/usr/bin/env perl

  # My snow leopard MAC server runs out of memory at >1.1 billion bytes. How
  # can this be when I have 32 GB of memory installed? Allocating up to 4
  # billion bytes works on a Dell Win7 12GB RAM machine.

  # perl -v
  # This is perl, v5.10.0 built for darwin-thread-multi-2level
  # (with 2 registered patches, see perl -V for more detail)

  use strict;
  use warnings;

  my $s;

  print "Trying 1.1 GB...";
  $s = "a" x 1100000000;   # ok
  print "OK\n\n";

  print "Trying 1.2 GB...";
  $s = '';
  $s = "a" x 1200000000;   # fails
  print "..OK\n";

Here is the result I get:

  Trying 1.1 GB...OK

  perl(96685) malloc: *** mmap(size=1200001024) failed (error code=12)
  *** error: can't allocate region
  *** set a breakpoint in malloc_error_break to debug
  Out of memory!
  Trying 1.2 GB...

Any ideas why this is happening?


UPDATE 4:42 pm 11/14/13

As suggested by Kent Fredrick (see two posts below), here are my ulimits. By default, virtual memory is unlimited.

  $ ulimit -a | grep bytes
  data seg size (kbytes, -d) unlimited
  max locked memory (kbytes, -l) unlimited
  max memory size (kbytes, -m) unlimited
  pipe size (512 bytes, -p) 1
  stack size (kbytes, -s) 8192
  virtual memory (kbytes, -v) unlimited

  $ perl -E 'my $x = "a" x 1200000000; print "ok\n"'
  perl(23074) malloc: *** mmap(size=1200001024) failed (error code=12)
  *** error: can't allocate region
  *** set a breakpoint in malloc_error_break to debug
  Out of memory!

  $ perl -E 'my $x = "a" x 1100000000; print "ok\n"'
  ok

I tried to set virtual memory to 10 billion, but to no avail.

  $ ulimit -v 10000000000   # 10 billion

  $ perl -E 'my $x = "a" x 1200000000; print "ok\n"'
  perl(24275) malloc: *** mmap(size=1200001024) failed (error code=12)
  *** error: can't allocate region
  *** set a breakpoint in malloc_error_break to debug
  Out of memory!
4 answers

I think I figured it out. I couldn't believe that Apple would ship a 32-bit Perl when their documentation says otherwise. From "man perl":

  64-BIT SUPPORT
      Version 5.10.0 supports 64-bit execution (which is on by default).
      Version 5.8.8 only supports 32-bit execution.

Then I remembered that I had installed Fink on my Mac server, and that it was, ahem, fraught with 32-bit and 64-bit issues. So I commented out

 #test -r /sw/bin/init.sh && . /sw/bin/init.sh 

in my .profile. Now I can allocate at least 14 GB of RAM (yay!) on my server with 32 GB of RAM:

  $ perl -E 'my $x = "a" x 14000000000; print "ok\n"'
  ok

I tried 16 GB, but it hung for 5 minutes before I gave up. The diff between perl -V for the 32-bit and 64-bit binaries tells the story (but why is intsize=4 in both?).

  $ diff perlv.32 perlv.64
  16c16
  <   intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=1234
  ---
  >   intsize=4, longsize=8, ptrsize=8, doublesize=8, byteorder=12345678
  18c18
  <   ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='off_t', lseeksize=8
  ---
  >   ivtype='long', ivsize=8, nvtype='double', nvsize=8, Off_t='off_t', lseeksize=8
  34,35c34,36
  <   PERL_IMPLICIT_CONTEXT PERL_MALLOC_WRAP USE_ITHREADS
  <   USE_LARGE_FILES USE_PERLIO USE_REENTRANT_API
  ---
  >   PERL_IMPLICIT_CONTEXT PERL_MALLOC_WRAP USE_64_BIT_ALL
  >   USE_64_BIT_INT USE_ITHREADS USE_LARGE_FILES
  >   USE_PERLIO USE_REENTRANT_API
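
For anyone else chasing the same PATH problem, a quick way to check which perl the shell picks up and whether it is a 64-bit build (the /sw path mentioned in the comments is simply Fink's default install location):

  # Which perl comes first in PATH? Fink installs under /sw, Apple's lives in /usr/bin.
  $ which perl

  # ptrsize=8 means a 64-bit build; ptrsize=4 means 32-bit.
  $ perl -V:ptrsize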

Thank you for your help,

Paul


You are using a 32-bit build of Perl (as perl -V:ptrsize shows), but you need a 64-bit build. I recommend installing a local perl using perlbrew.

This can be achieved by passing -Duse64bitall to Configure when installing Perl.

This can be achieved by passing --64all to perlbrew install when installing Perl.

(For some odd reason, perl -V:use64bitall says it was done, but it clearly wasn't.)
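
A minimal sketch of that workflow, assuming perlbrew is already installed; the version number below is only an example:

  # ptrsize=4 means the current perl is a 32-bit build; 8 means 64-bit.
  $ perl -V:ptrsize

  # Build and switch to a 64-bit perl (--64all is perlbrew's way of passing -Duse64bitall).
  $ perlbrew install perl-5.16.0 --64all
  $ perlbrew switch perl-5.16.0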


It looks like this may be related to the problem. This really only merits a comment, but it is too complex to post as one without it becoming illegible.

  perlbrew exec --with=5.10.0 memusage perl -e '$x = q[a] x 1_000_000_000; print length($x)'
  5.10.0
  ==========
  1000000000
  Memory usage summary: heap total: 2000150514, heap peak: 2000141265, stack peak: 4896

Yes, that is 2 GB of memory for 1 GB of text.

Now using 2 GB...

  perlbrew exec --with=5.10.0 memusage perl -e '$x = q[a] x 1_000_000_000; $y = q[a] x 1_000_000_000; print length($x)+length($y)'
  5.10.0
  ==========
  2000000000
  Memory usage summary: heap total: 4000151605, heap peak: 4000142092, stack peak: 4896

Ouch. That would certainly blow past a 32-bit limit, if you had one.

I was spoiled and did my testing on 5.19.5, which has a notable improvement called copy-on-write strings, which greatly reduces memory consumption:

  perlbrew exec --with=5.19.5 memusage perl -e '$x = q[a] x 1_000_000_000; $y = q[a] x 1_000_000_000; print length($x)+length($y)'
  5.19.5
  ==========
  2000000000
  Memory usage summary: heap total: 2000157713, heap peak: 2000150396, stack peak: 5392

So, either way, if you are using any non-development version of Perl, you should expect it to consume twice the memory you actually need.

If for some reason there is a memory limit around the 2 GB mark for 32-bit processes, you would hit it with a 1 GB string.
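
A quick back-of-envelope check against the numbers in the question (my own arithmetic, just applying the roughly 2x factor measured above):

  use strict;
  use warnings;

  # Rough peak for `$s = "a" x $n` on a pre-copy-on-write perl: the expanded
  # temporary plus the copy stored in $s (interpreter overhead ignored).
  my $n = 1_200_000_000;
  printf "roughly %.1f GB needed\n", 2 * $n / 1e9;   # ~2.4 GB for the failing case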

Why copy-on-write?

OK, when you do

 $a = $b 

$a becomes a copy of $b.

So when you do

 $a = "a" x 1_000_000_000 

Perl first expands the right-hand side, creating a temporary value, and then makes a copy of that value to store in $a.

You can prove this by eliminating the copy, like so:

  perlbrew exec --with=5.10.0 memusage perl -e 'print length(q[a] x 1_000_000_000)'
  5.10.0
  ==========
  1000000000
  Memory usage summary: heap total: 1000150047, heap peak: 1000140886, stack peak: 4896

See, all I did was eliminate the intermediate variable, and memory usage was halved!

:S

Although, since 5.19.5 only references the original string and copies it on write, it is efficient by default, so eliminating the intermediate variable brings only a slight advantage:

  perlbrew exec --with=5.19.5 memusage perl -e 'print length(q[a] x 1_000_000_000)'
  5.19.5
  ==========
  1000000000
  Memory usage summary: heap total: 1000154123, heap peak: 1000145146, stack peak: 5392
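
One practical takeaway (my sketch, not from the measurements above): if the big string only needs to be read elsewhere, pass it around by reference, so that assignments copy a reference rather than the whole buffer. That matters most on pre-copy-on-write perls.

  use strict;
  use warnings;

  # Build the big string once, then hand out references to it:
  # copying a reference copies a few bytes, not the 1 GB buffer.
  my $big = "a" x 1_000_000_000;
  my $ref = \$big;                  # cheap, no second buffer
  print length($$ref), "\n";        # read it through the reference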

It could also be a per-process memory limit on the Mac, there to keep individual processes from consuming too much system memory.

I do not know how likely this is, but I believe a Mac, being Unix-like, has Unix-style ulimits:

There are several such memory limits; here are some excerpts from /etc/security/limits.conf:

  - core - limits the core file size (KB)
  - data - max data size (KB)
  - fsize - maximum filesize (KB)
  - memlock - max locked-in-memory address space (KB)
  - rss - max resident set size (KB)
  - stack - max stack size (KB)
  - as - address space limit (KB)

bash provides ways to limit and read these (somewhat); see info bash --index-search=ulimit

For example, ulimit -a | grep bytes emits this on my Linux machine:

  data seg size (kbytes, -d) unlimited
  max locked memory (kbytes, -l) 64
  max memory size (kbytes, -m) unlimited
  pipe size (512 bytes, -p) 8
  POSIX message queues (bytes, -q) 819200
  stack size (kbytes, -s) 8192
  virtual memory (kbytes, -v) unlimited

And I can arbitrarily lower these limits within a shell session:

  $ perl -E 'my $x = "a" x 100000000; print "ok\n"'
  ok
  $ ulimit -v 200000
  $ perl -E 'my $x = "a" x 100000000; print "ok\n"'
  Out of memory!
  panic: fold_constants JMPENV_PUSH returned 2 at -e line 1.

So ulimits are certainly something to look into.
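
A hedged aside, not from the original discussion: on OS X itself, I believe the launchd-level per-process defaults can also be inspected with launchctl's limit subcommand; the exact resources listed vary by OS version, so treat this as a pointer rather than a recipe.

  # Assumes launchctl's "limit" subcommand is available on this OS X version.
  $ launchctl limit          # list soft/hard limits launchd applies to new processes
  $ launchctl limit data     # show just the data-segment limit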


Source: https://habr.com/ru/post/978831/

