Convert Index Vector to Matrix

I want to convert an index vector to a matrix with ones in the indexed columns.

 x = [2;1;3;1];
 m = someFunc(x,3)
 % m =
 %
 %    0 1 0
 %    1 0 0
 %    0 0 1
 %    1 0 0
4 answers

One way is to use the SUB2IND function:

 colN = 3;
 assert(max(x) <= colN, 'Not enough columns')  %# check that you have enough columns
 %# other checks that x contains valid indices
 m = zeros(numel(x), colN);
 m(sub2ind(size(m), 1:numel(x), x')) = 1;
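For context, sub2ind converts row/column subscripts into linear, column-major indices, so the assignment above sets exactly one element per row of m. A minimal sanity check:

 sub2ind([4 3], 1, 2)   % returns 5: element (1,2) of a 4-by-3 matrix is the 5th element in column-major order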

I tested the sub2ind approach, but on the Coursera Machine Learning forum this beauty was pointed out to me:

 m = eye(num_cols)(x,:); 

It uses an identity matrix: each value in x picks the identity row whose single 1 sits in the corresponding column.
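A worked version with the question's data may help. Note that indexing the result of eye(num_cols) in place is Octave syntax; the sketch below stores the identity matrix in a temporary variable (named I purely for illustration) so it also runs in MATLAB:

 x = [2;1;3;1];
 num_cols = 3;
 I = eye(num_cols);   % row k of the identity has its single 1 in column k
 m = I(x,:)           % take row x(i) of the identity for each element of x
 % m =
 %    0 1 0
 %    1 0 0
 %    0 0 1
 %    1 0 0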


I had a very similar question, so I did not want to open a new one. I wanted to convert an index row vector into a matrix with ones in the indexed rows (instead of columns). I could have taken the previous answer and transposed it, but I thought this approach would work better for very large matrices.

 octave> x = [2 1 3 1];
 octave> m = setRowsToOne(x, 3)
 m =
    0 1 0 1
    1 0 0 0
    0 0 1 0

I could not work out how to do this with sub2ind, so I wrote my own function.

 function matrixResult = setRowsToOne(indexOfRows, minimumNumberOfRows)
   numRows = max([indexOfRows minimumNumberOfRows]);
   numCols = columns(indexOfRows);
   assert(all(indexOfRows > 0), 'Indices must be positive.');
   matrixResult = zeros(numRows, numCols);
   %# linear index of row indexOfRows(j) in column j is (j-1)*numRows + indexOfRows(j)
   matrixResult((0:numCols-1) * numRows + indexOfRows) = 1;
 end

 x = [2 1 3 1];
 m = setRowsToOne(x, 3)
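For comparison, the same row-wise placement can also be written with sub2ind; this is only a sketch under the same assumptions (indexOfRows is a row vector of positive integers), not part of the original answer:

 x = [2 1 3 1];
 numRows = max([x 3]);                      % at least 3 rows, as in setRowsToOne(x, 3)
 m = zeros(numRows, numel(x));
 m(sub2ind(size(m), x, 1:numel(x))) = 1     % rows come from x, columns run 1..numel(x)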

You can use accumarray, which makes it very simple:

 accumarray([ (1:length(x))', x ], 1, [4, 3]) 

The 1:length(x) part indicates which row each element goes to, and x indicates which column.
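A slightly more general form of the same call (a sketch, not from the original answer) derives the output size from x instead of hard-coding [4, 3], and forces x into a column so the subscript matrix is N-by-2:

 m = accumarray([(1:numel(x))', x(:)], 1, [numel(x), max(x)])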

