Best practices for numpy matrix algebra

My question relates to the last line below: mu@sigma@mu. Why does this work? Is a one-dimensional ndarray treated as a row vector or a column vector? In either case, shouldn't it be mu.T@sigma@mu or mu@sigma@mu.T? I know that mu.T still returns mu because mu has only one dimension, but the interpreter still seems too smart.

>> import numpy as np
>> mu = np.array([1, 1])
>> print(mu)

[1 1]

>> sigma = np.eye(2) * 3
>> print(sigma)

[[ 3.  0.]
 [ 0.  3.]]

>> mu@sigma@mu

6.0

More generally, what is the best practice for matrix algebra in Python: use ndarray and @ for matrix multiplication as above (cleaner code), or use np.matrix and the overloaded * as below (mathematically less confusing)?

>> import numpy as np
>> mu = np.matrix(np.array([1, 1]))
>> print(mu)

[[1 1]]

>> sigma = np.matrix(np.eye(2) * 3)
>> print(sigma)

[[ 3.  0.]
 [ 0.  3.]]

>> a = mu * sigma * mu.T
>> a.item((0, 0))

6.0

Python evaluates the expression left to right:

In [32]: mu=np.array([1,1])
In [33]: sigma= np.array([[3,0],[0,3]])
In [34]: mu@sigma@mu
Out[34]: 6

So the first step is:

In [35]: temp=mu@sigma
In [36]: temp.shape
Out[36]: (2,)
In [37]: temp@mu
Out[37]: 6

For 1d and 2d arrays, @ behaves like np.dot. For two 1-d arrays that is the inner (dot) product. The same chain written with dot:

In [38]: mu.dot(sigma).dot(mu)
Out[38]: 6
In [39]: mu.dot(sigma).shape
Out[39]: (2,)

How np.dot and @ mix 1d and 2d arrays is well documented, and they work as documented.

numpy does not have separate row and column vectors; it has 0d, 1d and higher-dimensional arrays. The np.dot rules define what each combination of shapes produces.
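A small sketch (reusing the question's mu and sigma) of what those rules do for each shape combination:

```python
import numpy as np

mu = np.array([1, 1])      # 1d, shape (2,)
sigma = np.eye(2) * 3      # 2d, shape (2, 2)

# 1d @ 2d: mu acts like a row vector, result comes back 1d
print((mu @ sigma).shape)  # (2,)

# 2d @ 1d: mu acts like a column vector, result is again 1d
print((sigma @ mu).shape)  # (2,)

# 1d @ 1d: inner product, result is a scalar
print(mu @ mu)             # 2
```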

np.matrix was created to ease the transition for MATLAB users. Its arrays are always 2d (as everything was in 1990s MATLAB). It overloads * through its __mul__ method:

def __mul__(self, other):
    if isinstance(other, (N.ndarray, list, tuple)) :
        # This promotes 1-D vectors to row vectors
        return N.dot(self, asmatrix(other))
    if isscalar(other) or not hasattr(other, '__rmul__') :
        return N.dot(self, other)
    return NotImplemented

With matrices, Mu*sigma and Mu@sigma behave the same, and:

In [48]: Mu@sigma@Mu
...
ValueError: shapes (1,2) and (1,2) not aligned: 2 (dim 1) != 1 (dim 0)

Mu@sigma is (1,2) and Mu is also (1,2), so the shapes do not align; you have to transpose the last factor:

In [49]: Mu@sigma@Mu.T
Out[49]: matrix([[6]])

The result is a (1,1) matrix, so you need item (or indexing) to pull the scalar out. (In MATLAB there are no true scalars; everything has at least 2 dimensions, hence the same item()/indexing dance.)

@ is a relatively recent addition to Python and numpy. Python added it as an unimplemented operator; numpy (along with other packages) then implemented it.

Before that, we chained dot calls as in In [38]. That style still works fine.

There has been talk of deprecating np.matrix. (Its use with scipy.sparse is one reason it survives.)

" ", - np.einsum.


Some quick timings (the arrays are tiny, so this mostly measures call overhead):

In [57]: timeit mu.dot(sigma).dot(mu)
2.79 µs ± 7.75 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
In [58]: timeit mu@sigma@mu
6.29 µs ± 31.4 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
In [59]: timeit Mu@sigma@Mu.T
17.1 µs ± 134 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
In [60]: timeit Mu*sigma*Mu.T
17.7 µs ± 517 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)

, "" dot , .

As for mu@sigma@mu.T: nothing special happens with the .T before @ (matmul) is applied to mu. There is no interpreter magic here; Python hands the work off to numpy. Since mu is 1d, mu.T simply returns mu, so the .T is a no-op before @ runs.

Transpose is defined in terms of 2d arrays, but numpy generalizes it to any number of dimensions: it reorders the axes. For a 1d array there is nothing to reorder, so it does nothing. If you need an explicit row or column vector, use reshape or np.newaxis indexing.
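A quick sketch of .T being a no-op on a 1d array, versus explicitly adding an axis (illustrative, not from the original answer):

```python
import numpy as np

mu = np.array([1, 1])            # shape (2,)

print(mu.T.shape)                # (2,) -- .T on a 1d array is a no-op

col = mu[:, np.newaxis]          # explicit column vector
row = mu[np.newaxis, :]          # explicit row vector
print(col.shape, row.shape)      # (2, 1) (1, 2)

print(mu.reshape(2, 1).shape)    # (2, 1) -- same effect via reshape
```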

By contrast, Octave's transpose is only defined for 2d objects:

>> ones(2,3,4)'
error: transpose not defined for N-d objects

MATLAB has no true 1-d arrays at all. numpy's transpose, on the other hand, handles any number of dimensions:

In [270]: np.ones((2,3,4),int).T.shape
Out[270]: (4, 3, 2)

The np.dot documentation spells out the 1d case, the "inner product":

  • If both a and b are 1-D arrays, it is inner product of vectors (without complex conjugation).

The matmul documentation says the same thing for this case.

Take a concrete example:

With A = [1, 2] and B = [3, 5] as (2,) ndarrays, A@B is the MATLAB [1, 2] * [3, 5]' = 13, not [1, 2]' * [3, 5] = [[3, 5], [6, 10]].

In numpy:

In [273]: A = np.array([1,2]); B = np.array([3,5])

Elementwise multiplication (.* in MATLAB):

In [274]: A*B
Out[274]: array([ 3, 10])

In [275]: A@B       # same as np.dot(A,B)
Out[275]: 13

In [276]: np.outer(A,B)
Out[276]: 
array([[ 3,  5],
       [ 6, 10]])

The inner product can also be written as a sum of elementwise products:

In [278]: np.sum(A*B)
Out[278]: 13

einsum (which generalizes to more dimensions) makes the index summation explicit for both products:

In [280]: np.einsum('i,i',A,B)
Out[280]: array(13)
In [281]: np.einsum('i,j',A,B)
Out[281]: 
array([[ 3,  5],
       [ 6, 10]])

In [282]: A[:,None]*B[None,:]
Out[282]: 
array([[ 3,  5],
       [ 6, 10]])

So the closest numpy equivalent of MATLAB's [1, 2]' is A[:,None], which turns the 1d array into a column vector (2d).

In this respect numpy is more general than MATLAB. Thinking in MATLAB terms will only confuse you. :)

@ also handles the mixed 1d/2d case:

In [283]: A@B[:,None]       # your imagined A*B'
Out[283]: array([13])

With @, if a is a column vector (2d) and b is left 1d, it fails:

In [284]: A[:,None]@B
ValueError: shapes (2,1) and (2,) not aligned: 1 (dim 1) != 2 (dim 0)
In [285]: A[:,None]@B[None,:]
Out[285]: 
array([[ 3,  5],
       [ 6, 10]])

With 2d arrays, even ones with a size-1 dimension, @/dot follows strict matrix-multiplication rules. It is the mixing of 1d and 2d arrays that takes some thought, and those rules differ from MATLAB's, since MATLAB has no true 1d arrays.
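For illustration, once both operands are explicitly 2d, even with a size-1 dimension, the strict rules apply (a sketch, not from the original answer):

```python
import numpy as np

a = np.array([[1, 2]])     # row vector, shape (1, 2)
b = np.array([[3], [5]])   # column vector, shape (2, 1)

print(a @ b)               # [[13]] -- (1,2) @ (2,1) -> (1,1), the inner product
print((b @ a).shape)       # (2, 2) -- (2,1) @ (1,2), the outer product
```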

Octave bolted n-d arrays onto MATLAB's 2d core, so its n-d support is not as thorough as numpy's:

>> ones(2,3,4) * ones(2,3,4)
error: operator *: nonconformant arguments (op1 is 2x12, op2 is 2x12)

Note that when you multiply a 1d array one_d by a 2d array two_d, @ treats the 1d array as a row vector in the first case and as a column vector in the second:

one_d @ two_d # gives a row vector (that is returned as 1d array)
two_d @ one_d # gives a column vector (also returned as 1d array)

A 1d array is neither (n, 1) nor (1, n); @ temporarily promotes it to whichever shape makes the product work, then strips the added dimension from the result.

That is why mu@sigma@mu works: mu acts as a row vector on the left of sigma and as a column vector on the right.
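This promote-then-strip behavior of @ with mixed 1d/2d operands can be sketched as follows (an illustrative example, not from the original answer):

```python
import numpy as np

one_d = np.array([1, 2])             # shape (2,)
two_d = np.array([[1, 2], [3, 4]])   # shape (2, 2)

# promoted to (1, 2) @ (2, 2), then the leading 1 is stripped
print(one_d @ two_d)                 # [ 7 10]

# promoted to (2, 2) @ (2, 1), then the trailing 1 is stripped
print(two_d @ one_d)                 # [ 5 11]
```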


Source: https://habr.com/ru/post/1692850/