Slicing a NumPy array without losing dimension information

I am using numpy and want to index a row without losing the dimension information.

import numpy as np
X = np.zeros((100,10))
X.shape          # >> (100, 10)
xslice = X[10,:]
xslice.shape     # >> (10,)

In this example, xslice is now 1-dimensional, but I want it to be (1, 10). In R, I would use X[10, , drop=F]. Is there something similar in numpy? I could not find it in the documentation and did not see a similar question.

Thanks!

+71
python numpy
Aug 23 '10
6 answers

Probably the easiest way is to use x[None, 10, :] or the equivalent (but more readable) x[np.newaxis, 10, :].
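For concreteness, here is a quick sketch of the resulting shapes, using the X from the question (the lowercase x in the answer is assumed to be the same array):

 import numpy as np
 X = np.zeros((100, 10))
 # None / np.newaxis inserts a new leading axis, so selecting a single
 # row keeps two dimensions instead of collapsing to (10,)
 X[None, 10, :].shape        # >> (1, 10)
 X[np.newaxis, 10, :].shape  # >> (1, 10)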

As for why that is the default: I personally find that constantly ending up with arrays that have singleton dimensions gets very annoying very quickly. I would guess the numpy devs felt the same way.

Also, numpy handles broadcasting arrays very well, so there is usually little reason to retain the dimension of the array the slice came from. If it did retain it, then things like:

 a = np.zeros((100, 100, 10))
 b = np.zeros((100, 10))
 a[0, :, :] = b

would either not work or be much harder to implement.

(Or at least that is my guess at the numpy devs' reasoning behind dropping dimension information when slicing.)

+44
Aug 23 '10 at 21:30

Another solution is

 X[[10],:] 

or

 I = np.array([10])
 X[I, :]

The dimensionality of the array is preserved when the indexing is performed with a list (or an array) of indices. This is nice because it leaves you the choice between keeping the dimension and squeezing it.
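A small sketch of both spellings, assuming the X from the question:

 import numpy as np
 X = np.zeros((100, 10))
 # indexing with a one-element list (or array) keeps the row axis
 X[[10], :].shape    # >> (1, 10)
 I = np.array([10])
 X[I, :].shape       # >> (1, 10)
 # plain integer indexing, by contrast, drops it
 X[10, :].shape      # >> (10,)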

+73
Aug 12 '13 at 9:05

I found some reasonable solutions.

1) use numpy.take(X,[10],0)

2) use this weird indexing X[10:11:, :]

Ideally, this should be the default behaviour. I never understood why dimensions are dropped, but that is a discussion for numpy...
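As a quick sanity check, both approaches keep the leading axis (assuming the X from the question):

 import numpy as np
 X = np.zeros((100, 10))
 # take() with a list of indices along axis 0 preserves the dimension
 np.take(X, [10], 0).shape   # >> (1, 10)
 # the slice 10:11 selects exactly one row but stays 2-D
 X[10:11:, :].shape          # >> (1, 10)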

+26
Aug 23 '10 at 9:26 a.m.

I like this alternative better: instead of indexing with a single number, index with a range. That is, use X[10:11, :]. (Note that 10:11 does not include 11.)

 import numpy as np
 X = np.zeros((100,10))
 X.shape             # >> (100, 10)
 xslice = X[10:11,:]
 xslice.shape        # >> (1, 10)

This is easy to read even with many dimensions: there is no juggling with None and no figuring out which axis needs which index. Also, you do not need any extra bookkeeping about the array's size; just use i:i+1 for any i that you would have used for normal indexing.

 b = np.ones((2, 3, 4))
 b.shape               # >> (2, 3, 4)
 b[1:2, :, :].shape    # >> (1, 3, 4)
 b[:, 2:3, :].shape    # >> (2, 1, 4)
+2
Mar 19 '19 at 18:19

To add to gnebehay's solution of indexing with lists or arrays, you can also use tuples:

 X[(10,),:] 
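For example (again assuming the X from the question), a one-element tuple of row indices behaves like a one-element list here:

 import numpy as np
 X = np.zeros((100, 10))
 # the tuple (10,) is treated as a sequence of indices, so the axis is kept
 X[(10,), :].shape   # >> (1, 10)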
+1
Mar 20 '19 at 16:25

This is especially annoying if you are indexing with an index array that may happen to have length 1 at runtime. For that case there is np.ix_:

 some_array[np.ix_(row_index,column_index)] 
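A minimal sketch of how np.ix_ behaves; the names row_index and column_index come from the answer above, but the example shapes and values are assumptions:

 import numpy as np
 some_array = np.zeros((100, 10))
 row_index = [10]          # may happen to have length 1 at runtime
 column_index = [0, 3, 5]
 # np.ix_ builds an open mesh from the index sequences, so the result
 # keeps one axis per sequence: here (1, 3)
 some_array[np.ix_(row_index, column_index)].shape   # >> (1, 3)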
0
Apr 16 '19 at 18:59


