Why does cv2 dilation have no effect on my image?

So, I generate a binary image (well, really an 8-bit grayscale image used as binary) with Python and OpenCV, drawing a small number of polygons into the image, and then dilating the image with a kernel. However, the source and destination images always end up identical, no matter which kernel I use. Any thoughts?

    from matplotlib import pyplot
    import numpy as np
    import cv2

    binary_image = np.zeros(image.shape, dtype='int8')
    for rect in list_of_rectangles:
        cv2.fillConvexPoly(binary_image, np.array(rect), 255)
    kernel = np.ones((11, 11), 'int')
    dilated = cv2.dilate(binary_image, kernel)
    if np.array_equal(dilated, binary_image):
        print("EPIC FAIL!!")
    else:
        print("eureka!!")

All I get is EPIC FAIL!!

Thanks!

1 answer

So, it turns out the problem was how I created both the kernel and the image. OpenCV expects 'uint8' as the data type for both. In this particular case, I created the kernel with dtype='int', which defaults to 'int64'. In addition, I created the image as 'int8', not 'uint8'. Somehow this did not raise an exception, but it caused the dilation to silently do nothing.
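
A quick way to spot the mismatch is to print the dtypes directly. A minimal sketch (note that 'int' maps to int64 on most 64-bit Linux/macOS NumPy builds, int32 on Windows):

    import numpy as np

    kernel = np.ones((11, 11), 'int')
    print(kernel.dtype)        # int64 on most 64-bit platforms, not uint8

    binary_image = np.zeros((100, 100), dtype='int8')
    print(binary_image.dtype)  # int8 is signed, so it cannot even hold 255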

Changing the two lines above to

    binary_image = np.zeros(image.shape, dtype='uint8')
    kernel = np.ones((11, 11), 'uint8')

fixes the problem, and now I get eureka!! Hurrah!
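
For completeness, here is a self-contained sketch of the fixed version. Since `image` and `list_of_rectangles` are never defined in the question, the placeholder values below are assumptions just to make it runnable; the polygon points are also cast to int32, which cv2.fillConvexPoly requires:

    import numpy as np
    import cv2

    # Hypothetical stand-ins for the question's undefined `image` and
    # `list_of_rectangles`, only so the sketch runs on its own.
    image = np.zeros((200, 200), dtype='uint8')
    list_of_rectangles = [[(20, 20), (80, 20), (80, 80), (20, 80)]]

    binary_image = np.zeros(image.shape, dtype='uint8')   # uint8, not int8
    for rect in list_of_rectangles:
        # fillConvexPoly expects int32 point coordinates
        cv2.fillConvexPoly(binary_image, np.array(rect, dtype=np.int32), 255)

    kernel = np.ones((11, 11), 'uint8')                   # uint8 kernel
    dilated = cv2.dilate(binary_image, kernel)

    print(np.array_equal(dilated, binary_image))  # False: the polygons grew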


Source: https://habr.com/ru/post/919370/

