I have a program that I wrote on my Mac, but it won't run on my Raspberry Pi: it runs out of RAM (MemoryError).
The essence of the program is some image processing: it convolves a 640x480 uint8 image with a complex128 matrix twice its size in each dimension.
I believe the memory usage should be:

Initial image:
640 x 480 x 8 bits / (8 bits per byte) / 1024 = 300 KiB

Complex matrix (twice the image size in each dimension):
640 x 480 x 2^2 x 128 bits / (8 bits per byte) / 1024^2 = 18.75 MiB
Even allowing for two or three copies of these matrices in memory at once, the total should be fairly small, perhaps 100 MB. Unfortunately, the program seems to exhaust the full 330 MB available (the Python runtime also has to fit in that space).
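To sanity-check the arithmetic above, the sizes can be computed directly from NumPy's `nbytes` (a small sketch; the shapes are the ones from the question):

```python
import numpy as np

# Initial image: 640x480, one byte per uint8 pixel
img = np.zeros((480, 640), dtype=np.uint8)
print(img.nbytes / 1024)       # 300.0 (KiB)

# Complex matrix: twice the size in each dimension, 16 bytes per complex128
gabor = np.zeros((960, 1280), dtype=np.complex128)
print(gabor.nbytes / 1024**2)  # 18.75 (MiB)
```

So the static arrays alone are nowhere near 330 MB; the question is what the processing allocates on top of them.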
- Is my analysis correct?
- Are there any tips for managing memory better in Python?
UPDATE:
Here is the memory_profiler output for the function that calls fftconvolve:
Line #    Mem usage    Increment   Line Contents
================================================
    65   86.121 MiB    0.000 MiB   @profile
    66                             def iriscode(self):
    67   86.121 MiB    0.000 MiB       img = self.polar
    68
    69   86.379 MiB    0.258 MiB       pupil_curve = find_max(img[0:40])
    70   86.379 MiB    0.000 MiB       blur = cv2.GaussianBlur(self.polar, (9, 9), 0)
    71   76.137 MiB  -10.242 MiB       iris_fft = fit_iris_fft(radial_diff(blur[50:, :])) + 50
    72
    73   76.160 MiB    0.023 MiB       img = warp(img, iris_fft, pupil_curve)
    74                                 # cv2.imshow("mask", np.uint8(ma.getmaskarray(img)) * 255)
    75
    76                                 global GABOR_FILTER
    77  262.898 MiB  186.738 MiB       output = signal.fftconvolve(GABOR_FILTER, img, mode="valid")
As you can see, the fftconvolve call accounts for essentially all of the memory growth. Is this expected, and is there any way to reduce it? Would switching from complex128 to complex64 help, since I don't need the extra precision?
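The jump is plausible: fftconvolve zero-pads both inputs to roughly the combined output size (here about 1439x1919) and works on complex128 intermediates, each around 42 MiB, and several of those exist at once during the forward FFTs, the product, and the inverse FFT. Halving the element size with complex64 is one lever; a sketch (the shapes here are assumptions matching the question, and whether single precision is preserved internally depends on the SciPy version, so measure on the Pi):

```python
import numpy as np
from scipy import signal

# Stand-ins for the arrays in the question (hypothetical contents)
img = np.random.rand(480, 640).astype(np.float32)
gabor = np.random.rand(960, 1280) + 1j * np.random.rand(960, 1280)

# complex64 stores 8 bytes per element instead of 16
gabor64 = gabor.astype(np.complex64)
print(gabor.nbytes // gabor64.nbytes)  # 2

# "valid" keeps only the region where the arrays fully overlap
out = signal.fftconvolve(gabor64, img, mode="valid")
print(out.shape)  # (481, 641)
```

Other options worth trying: convolve with only the real (or only the imaginary) part at a time, or crop the filter if its support is smaller than the full 960x1280 array.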