NumPy Memory Usage

I have a program that I wrote on my Mac, but it won't run on my Raspberry Pi because it runs out of RAM (MemoryError).

The essence of the program is some image processing: it convolves a 640x480 uint8 image with a complex128 filter that is twice its size in each dimension (1280x960).

My estimate of the memory usage is as follows. Initial image:

640 x 480 x 8 bits / (8 bits per byte) / 1024 = 300 KiB

Complex matrix:

640 x 480 x 2^2 x 128 bits / (8 bits per byte) / 1024^2 = 18.75 MiB
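A quick sanity check of this arithmetic in NumPy (assuming the filter really is 1280x960 complex128):

import numpy as np

img = np.zeros((480, 640), dtype=np.uint8)           # source image
filt = np.zeros((960, 1280), dtype=np.complex128)    # filter, 2x each dimension

print(img.nbytes / 1024)      # 300.0 (KiB)
print(filt.nbytes / 1024**2)  # 18.75 (MiB)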

Even supposing it holds two or three copies of these various matrices in memory at once, that should still be fairly modest, perhaps 100 MB. Unfortunately it seems to exhaust the full 330 MB available (the Python runtime also has to fit in that space).

  • Is my analysis correct?
  • Are there any tricks for managing memory better in Python?

UPDATE:

As suggested, I profiled the memory usage line by line (memory_profiler output below), and it is the fftconvolve call that eats it:

Line #    Mem usage    Increment   Line Contents
================================================

65   86.121 MiB    0.000 MiB     @profile
66                               def iriscode(self):
67   86.121 MiB    0.000 MiB       img = self.polar
68
69   86.379 MiB    0.258 MiB       pupil_curve = find_max(img[0:40])
70   86.379 MiB    0.000 MiB       blur = cv2.GaussianBlur(self.polar, (9, 9), 0)
71   76.137 MiB  -10.242 MiB       iris_fft = fit_iris_fft(radial_diff(blur[50:,:])) + 50
72
73   76.160 MiB    0.023 MiB       img = warp(img, iris_fft, pupil_curve)
74                                 # cv2.imshow("mask",np.uint8(ma.getmaskarray(img))*255)
75
76                                 global GABOR_FILTER
77  262.898 MiB  186.738 MiB       output = signal.fftconvolve(GABOR_FILTER, img, mode="valid")

So it is fftconvolve that allocates almost 187 MiB in one go. Why does it need so much memory, and how can I reduce it? Would using complex64 instead of complex128 help?

ANSWER:

The short version is that this is simply how fftconvolve works: the arrays it operates on internally are much larger than your inputs.

fftconvolve computes the linear (not circular) convolution via the FFT, so it must first zero-pad both inputs to a common shape of at least s1 + s2 - 1 along each axis, here (1280+640-1, 960+480-1) = (1919, 1439). Because FFTs are far faster on sizes whose only prime factors are 2, 3 and 5, each dimension is then rounded up to such a size, giving (1920, 1440). A single complex128 array of that shape occupies 1920 * 1440 * 16 / 2**20 = 42 MiB.
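That size arithmetic can be sketched as follows; next_smooth here is a simplified stand-in for the 5-smooth rounding SciPy performs internally (recent versions expose it as scipy.fft.next_fast_len):

import numpy as np

def next_smooth(n):
    # smallest integer >= n whose only prime factors are 2, 3 and 5
    while True:
        m = n
        for p in (2, 3, 5):
            while m % p == 0:
                m //= p
        if m == 1:
            return n
        n += 1

s1, s2 = (1280, 960), (640, 480)            # filter and image shapes
full = [a + b - 1 for a, b in zip(s1, s2)]  # [1919, 1439]
fshape = [next_smooth(n) for n in full]     # [1920, 1440]
mib = np.prod(fshape) * 16 / 2**20          # complex128 = 16 bytes/element
print(fshape, round(mib, 1))                # [1920, 1440] 42.2
print(round(4 * mib, 1))                    # 168.8 -- four live copies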

At least two of these padded arrays have to be alive at the same time (the forward transforms of both inputs), and the inverse transform, the slice, and the final copy each create further intermediates before the result is returned.

I haven't traced the exact sequence of allocations, but at the peak there are plausibly 3, perhaps 4, such arrays alive at once, plus the FFT's own workspace; that matches the 186 MiB jump you measured.

You can check this yourself in the source of scipy.signal.fftconvolve. The branch taken for complex input is this else clause:

else:
    ret = fftn(in1, fshape)           # forward FFT of padded in1: one full-size complex array
    ret *= fftn(in2, fshape)          # forward FFT of padded in2: a second full-size temporary
    ret = ifftn(ret)[fslice].copy()   # inverse FFT, then slice out the requested region

Each of those three transform calls materializes another array of roughly 40 MiB, so the peaks add up quickly.
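If complex64 precision is good enough, one option is to redo that loop in single precision so every temporary is half the size. This is a sketch rather than anything from the original code, and it assumes a SciPy recent enough that scipy.fft preserves complex64 and provides next_fast_len:

import numpy as np
from scipy import fft

def fftconvolve_c64(in1, in2):
    # same idea as scipy.signal.fftconvolve in 'full' mode, but all
    # temporaries are complex64, halving each padded array
    full = [a + b - 1 for a, b in zip(in1.shape, in2.shape)]
    fshape = [fft.next_fast_len(n) for n in full]
    f = fft.fftn(in1.astype(np.complex64), fshape)
    f *= fft.fftn(in2.astype(np.complex64), fshape)
    out = fft.ifftn(f)
    return out[tuple(slice(0, n) for n in full)]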

ANSWER:

Two suggestions:

  • Avoid the one-shot fftconvolve, or apply it to the image in smaller pieces (see the sketch below).
  • Try scipy.weave (or plain numpy) for a direct implementation of the inner loop.
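As a concrete sketch of the first suggestion (my illustration, assuming SciPy 1.4+ where scipy.signal.oaconvolve exists):

from scipy import signal

# possible drop-in for the profiled call: overlap-add convolution can
# split the work into smaller FFT blocks instead of one huge padded
# transform; whether it actually helps here is worth measuring on the Pi
output = signal.oaconvolve(GABOR_FILTER, img, mode="valid")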

Source: https://habr.com/ru/post/1544013/

