Problem
I have a circular boolean mask of arbitrary radius (always perfectly symmetrical):
array([[False, False,  True, False, False],
       [False,  True,  True,  True, False],
       [ True,  True,  True,  True,  True],
       [False,  True,  True,  True, False],
       [False, False,  True, False, False]], dtype=bool)
Then I have a large uint8 matrix, image, and a point in that matrix, which can be any valid index.
What I want to do is apply the mask centered at that point, so that the circle is essentially stamped onto image there.
This is pretty easy to do in the middle of the image. You can simply:
image[point[0] - radius:point[0] + radius + 1, point[1] - radius:point[1] + radius + 1] = circle_mask
But naturally, this does not handle border clipping, which turns out to be surprisingly fiddly: I have to make sure that the slice of image being assigned to is exactly the same size as the clipped mask.
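To see why the naive slice fails near a border: when point[0] - radius goes negative, the slice start wraps around to the end of the axis, the slice comes back empty, and the assignment raises a broadcast error. A minimal reproduction:

```python
import numpy as np

radius = 2
x, y = np.ogrid[-radius : radius + 1, -radius : radius + 1]
circle_mask = x**2 + y**2 <= radius**2

image = np.zeros((10, 10))
point = (1, 1)  # too close to the top-left corner

try:
    # point[0] - radius == -1, so image[-1:4, -1:4] is an empty slice,
    # and a (5, 5) mask cannot be broadcast into it.
    image[point[0] - radius : point[0] + radius + 1,
          point[1] - radius : point[1] + radius + 1] = circle_mask
except ValueError as e:
    print("broadcast failed:", e)
```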
Example
For example, with point = (1, 1) and radius = 2 on a zeroed 10×10 image, the result should be:
array([[ 1.,  1.,  1.,  0.,  0., ...,  0.],
       [ 1.,  1.,  1.,  1.,  0., ...,  0.],
       [ 1.,  1.,  1.,  0.,  0., ...,  0.],
       [ 0.,  1.,  0.,  0.,  0., ...,  0.],
       [ 0.,  0.,  0.,  0.,  0., ...,  0.],
       ...,
       [ 0.,  0.,  0.,  0.,  0., ...,  0.]])
The code that currently produces this result:
point = (1, 1)
radius = 2
image = np.zeros((10, 10))
x, y = np.ogrid[-radius : radius + 1, -radius : radius + 1]
circle_mask = x**2 + y**2 <= radius**2
# Clip the destination window to the image bounds.
image_min_row = max(point[0] - radius, 0)
image_min_col = max(point[1] - radius, 0)
image_max_row = min(point[0] + radius + 1, image.shape[0])
image_max_col = min(point[1] + radius + 1, image.shape[1])
# Clip the mask by the same amounts, in mask coordinates.
mask_min_row = max(radius - point[0], 0)
mask_min_col = max(radius - point[1], 0)
mask_max_row = min(image.shape[0] - point[0] + radius, circle_mask.shape[0])
mask_max_col = min(image.shape[1] - point[1] + radius, circle_mask.shape[1])
temp_mask = circle_mask[mask_min_row:mask_max_row, mask_min_col:mask_max_col]
image[image_min_row:image_max_row, image_min_col:image_max_col][temp_mask] = 1
This works, but it is ugly and verbose: eight hand-computed bounds, each an easy place for an off-by-one error, and all of them have to stay consistent with point and the mask size.
Is there a more elegant way to do this in NumPy?
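For comparison, one way to shrink the bookkeeping (a sketch; the helper name stamp_circle is mine): compute the unclipped window once, clip it to the image, and derive the mask slice from the clip offsets, so the two slices have the same shape by construction.

```python
import numpy as np

def stamp_circle(image, point, radius, value=1):
    """Hypothetical helper: stamp a circular mask at `point`, clipping at borders."""
    x, y = np.ogrid[-radius : radius + 1, -radius : radius + 1]
    circle_mask = x**2 + y**2 <= radius**2

    # Unclipped window bounds in image coordinates.
    top, left = point[0] - radius, point[1] - radius
    bottom, right = point[0] + radius + 1, point[1] + radius + 1

    # Clip the window to the image.
    itop, ileft = max(top, 0), max(left, 0)
    ibottom, iright = min(bottom, image.shape[0]), min(right, image.shape[1])

    # The same offsets, shifted into mask coordinates, clip the mask identically.
    sub_mask = circle_mask[itop - top : ibottom - top,
                           ileft - left : iright - left]
    image[itop:ibottom, ileft:iright][sub_mask] = value
    return image
```

Usage: stamp_circle(np.zeros((10, 10)), (1, 1), 2) reproduces the clipped example above, and a point in the interior gets the full circle.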