Cython Memoryviews - Seg Fault on Large Arrays?

I'm observing strange behavior with a very simple integer array:

    %%cython
    import numpy as np
    cimport cython
    cimport numpy as np

    def hi():
        DEF MAX = 10000000
        cdef int a[MAX], i
        cdef int[:] a_mv = a

This leads to crashes, but views on smaller arrays work fine. It is not an obvious out-of-memory problem, as there is enough RAM for 10 million ints ...

1 answer

As Kevin notes in his comment, the problem is not RAM but the stack. You are allocating an array of 10 million items on the stack, when you really need to allocate it on the heap with malloc and friends. Even in C, this causes a segmentation fault:

    /* bigarray.c */
    int main(void) {
        int array[10000000];
        array[5000000] = 1; /* Force Linux to actually allocate the memory. */
        return 0;
    }

    $ gcc -O0 bigarray.c   # -O0 to prevent optimizations by the compiler
    $ ./a.out
    Segmentation fault (core dumped)

While the heap-allocated version runs fine:

    /* bigarray2.c */
    #include <stdlib.h>
    int main(void) {
        int *array;
        array = malloc(10000000 * sizeof(int));
        array[5000000] = 1;
        return 0;
    }

    $ gcc -O0 bigarray2.c
    $ ./a.out
    $ echo $?
    0
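The same fix applied to the original Cython snippet would look roughly like this: allocate the buffer on the heap and point the typed memoryview at it. This is a sketch, not from the original answer; `malloc`/`free` come from Cython's `libc.stdlib`, and the `<int[:n]>` cast is Cython's standard way to wrap a C pointer in a memoryview.

```cython
%%cython
from libc.stdlib cimport malloc, free

def hi():
    cdef Py_ssize_t n = 10000000
    # Heap allocation instead of "cdef int a[MAX]" on the stack.
    cdef int *a = <int *> malloc(n * sizeof(int))
    if a == NULL:
        raise MemoryError()
    # Typed memoryview over the heap buffer.
    cdef int[:] a_mv = <int[:n]> a
    try:
        a_mv[n // 2] = 1   # touch the memory; no segfault now
        return a_mv[n // 2]
    finally:
        free(a)
```

An even simpler route is to let NumPy own the memory, e.g. `cdef int[:] a_mv = np.empty(n, dtype=np.intc)`, which avoids manual `free` entirely.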

Source: https://habr.com/ru/post/1485306/

