Vectorizing numpy.einsum

I have the following four tensors

  • H (h, r)
  • A (a, r)
  • D (d, r)
  • T (a, t, r)

For each i in a there is a corresponding T[i] of shape (t, r).
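
For concreteness, a minimal setup sketch with assumed (made-up) dimension sizes; only the shapes matter, the actual sizes and values are placeholders:

    import numpy as np

    # hypothetical dimension sizes, chosen only for illustration
    h, a, d, t, r = 3, 4, 5, 6, 7

    H = np.random.rand(h, r)
    A = np.random.rand(a, r)
    D = np.random.rand(d, r)
    T = np.random.rand(a, t, r)   # T[i] has shape (t, r) for each i in range(a)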

I need an np.einsum call that produces the following result (pred):

    # allocate pred with the right shape (values are overwritten below)
    pred = np.einsum('hr, ar, dr, tr -> hadt', H, A, D, T[0])
    for i in range(a):
        pred[:, i:i+1, :, :] = np.einsum('hr, ar, dr, tr -> hadt',
                                         H, A[i:i+1], D, T[i])

However, I want to do this calculation without a for loop, because I use autograd, which currently does not support element assignment.

1 answer

One way is to use all of T's dimensions in a single call -

    np.einsum('Hr, Ar, Dr, ATr -> HADT', H, A, D, T)

Since we need a sum-reduction over the r axis across all inputs while keeping every other axis in the output, I don't see an intermediate step that would let us bring in any dot-based tools here to leverage BLAS.
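
As a quick sanity check, a sketch that compares the loop-based version from the question with the single einsum call, continuing from the hypothetical setup above:

    # loop-based reference (what the question computes)
    pred_loop = np.empty((h, a, d, t))
    for i in range(a):
        pred_loop[:, i:i+1, :, :] = np.einsum('hr, ar, dr, tr -> hadt',
                                              H, A[i:i+1], D, T[i])

    # vectorized version from this answer
    pred_vec = np.einsum('Hr, Ar, Dr, ATr -> HADT', H, A, D, T)

    print(np.allclose(pred_loop, pred_vec))  # True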
