Creating image fragments (m * n) of the original image using Python and Numpy

I use numpy to create tiles (224 x 224) from my 16-bit tiff image (13777 x 16004). I was able to slice equal tiles sized 224 x 224 along the rows and columns. I ran into problems when trying to create new tiles shifted by half the size of a tile. For example, a crude sketch of what I'm trying to achieve:

(1:224, 1:224)

(1:224, 112:336)

(1:224, 224:448)

The goal is to preserve the size of the tile (224 x 224) while shifting by half the tile size to get more fragments of the image.
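The intended pattern of overlapping windows can be sketched like this (my own illustration, assuming 0-based numpy indexing rather than the 1-based indices above):

```python
import numpy as np

tile = 224        # tile edge length
step = tile // 2  # shift by half a tile -> 112

# Start positions of overlapping windows along one 13777-pixel axis
starts = np.arange(0, 13777 - tile + 1, step)
windows = [(int(s), int(s) + tile) for s in starts]
print(windows[:3])  # [(0, 224), (112, 336), (224, 448)]
```

Every window keeps the full 224-pixel extent; only the start position advances by 112.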

A snippet of the code I wrote for this task:

```python
from math import ceil
from gmpy2 import mpz  # arbitrary-precision integer; a plain int would also work here

row_x = img.shape[0]
column_y = img.shape[1]
tile_size_x = 224
tile_size_y = 224
range_x = mpz(ceil(row_x / tile_size_x))
range_y = mpz(ceil(column_y / tile_size_y))

for x in range(range_x, row_x):
    for y in range(range_y, column_y):
        x0 = x * tile_size_x
        x1 = int(x0 / 2) + tile_size_x
        y0 = y * tile_size_y
        y1 = int(y0 / 2) + tile_size_y
        z = img[x0:x1, y0:y1]
        print(z.shape, z.dtype)
```

I keep getting the wrong results, can anyone help?

2 answers

You are a bit off in calculating the range of the for loop. The number of slices should be computed from the offset between two consecutive slices, which in your case is x0/2. I simplified your code and defined some meaningful variables that you can configure to get the desired slices from a given image:


```python
import cv2
import math

img = cv2.imread("/path/to/lena.png")  # 512x512
img_shape = img.shape
tile_size = (256, 256)
offset = (256, 256)

for i in range(int(math.ceil(img_shape[0] / (offset[1] * 1.0)))):
    for j in range(int(math.ceil(img_shape[1] / (offset[0] * 1.0)))):
        cropped_img = img[offset[1]*i:min(offset[1]*i + tile_size[1], img_shape[0]),
                          offset[0]*j:min(offset[0]*j + tile_size[0], img_shape[1])]
        # Write each tile out for debugging
        cv2.imwrite("debug_" + str(i) + "_" + str(j) + ".png", cropped_img)
```

Since the current offset is an exact multiple of the image size (512x512), we get 4 tiles of equal size:


If you change the offset to a value that is not an exact multiple of the image size, the border tiles will come out with a different size. You can filter those fragments out later if they are not needed, or change math.ceil to math.floor in the for loops to skip them.
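To get the overlapping tiles from the original question, you would set the offset to half the tile size. A minimal sketch of that adaptation (my own, not from the original answer; it uses a zero-filled numpy array as a stand-in for the image so no file is needed):

```python
import math
import numpy as np

img = np.zeros((512, 512), dtype=np.uint16)  # stand-in for the 16-bit image
tile_size = (256, 256)
offset = (128, 128)  # half the tile size -> overlapping tiles

tiles = []
for i in range(int(math.ceil(img.shape[0] / (offset[1] * 1.0)))):
    for j in range(int(math.ceil(img.shape[1] / (offset[0] * 1.0)))):
        cropped = img[offset[1]*i:min(offset[1]*i + tile_size[1], img.shape[0]),
                      offset[0]*j:min(offset[0]*j + tile_size[0], img.shape[1])]
        tiles.append(cropped)

# Keep only full-size tiles; the ones touching the border are smaller
full = [t for t in tiles if t.shape == tile_size]
print(len(tiles), len(full))  # 16 tiles total, 9 of them full-size
```

With a 128-pixel offset there are 4 window starts per axis (16 tiles), but only the 3 starts per axis that leave room for a full 256-pixel window produce full-size tiles (9 tiles).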


You can use as_strided to do this quite efficiently, I think.

```python
import numpy as np

def window_nd(a, window, steps=None):
    ashp = np.array(a.shape)
    wshp = np.array(window).reshape(-1)
    if steps:
        stp = np.array(steps).reshape(-1)
    else:
        stp = np.ones_like(ashp)
    astr = np.array(a.strides)
    assert np.all(np.r_[ashp.size == wshp.size,
                        wshp.size == stp.size,
                        wshp <= ashp])
    # Leading dims index the window grid, trailing dims index within a window
    shape = tuple((ashp - wshp) // stp + 1) + tuple(wshp)
    strides = tuple(astr * stp) + tuple(astr)
    as_strided = np.lib.stride_tricks.as_strided
    aview = as_strided(a, shape=shape, strides=strides)
    return aview
```
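A quick sanity check of the same shape/stride arithmetic on a small array (my own example, applying as_strided directly):

```python
import numpy as np
from numpy.lib.stride_tricks import as_strided

# 6x6 array, 4x4 windows stepped by 2 -> a 2x2 grid of windows
a = np.arange(36).reshape(6, 6)
window, step = (4, 4), (2, 2)
shape = tuple((np.array(a.shape) - window) // step + 1) + window
strides = tuple(np.array(a.strides) * step) + a.strides
view = as_strided(a, shape=shape, strides=strides)
print(view.shape)        # (2, 2, 4, 4)
print(view[1, 1, 0, 0])  # top-left element of the window starting at a[2, 2] -> 14
```

Note that as_strided returns a view into the original buffer, so the overlapping windows cost no extra memory until you copy them.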

EDIT: Generalized the step handling as much as possible.

For your specific question:

```python
wx, wy = 288, 288
aview = window_nd(a, (wx, wy), (144, 144))
z = aview.copy().reshape(-1, wx, wy)  # flatten the window grid to match the expected output
print(z.shape, z.dtype)  # z.shape should be (num_patches, 288, 288)
```
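On NumPy 1.20+ there is also a built-in, bounds-safe alternative, sliding_window_view, which produces every window at step 1; you can then take every n-th window start by slicing the leading axes (my suggestion, not part of the original answer):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

a = np.zeros((512, 512), dtype=np.uint16)  # stand-in image
wx, wy = 288, 288
step = 144

# All 288x288 windows, then keep every 144th start along each axis
view = sliding_window_view(a, (wx, wy))[::step, ::step]
z = view.reshape(-1, wx, wy)  # reshape copies, since the view is not contiguous
print(z.shape, z.dtype)
```

Unlike raw as_strided, sliding_window_view cannot read out of bounds if the shape/stride arithmetic is wrong.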

Source: https://habr.com/ru/post/1274885/

