Matmul with different rank

I have 3 tensors:
X, shape (1, c, h, w), for example (1, 20, 40, 50)
Fx, shape (num, w, N), for example (1000, 50, 10)
Fy, shape (num, N, h), for example (1000, 10, 40)

What I want to compute is Fy * (X * Fx), where * means matmul:
X * Fx has shape (num, c, h, N), for example (1000, 20, 40, 10)
Fy * (X * Fx) has shape (num, c, N, N), for example (1000, 20, 10, 10)

I currently use tf.tile and tf.expand_dims to do this,
but I think it uses a lot of memory (does tile really copy the data?), and I am looking for a better way that runs faster and uses less memory.

# X: (1, c, h, w)
# Fx: (num, w, N)
# Fy: (num, N, h)

X = tf.tile(X, [tf.shape(Fx)[0], 1, 1, 1])  # (num, c, h, w)
Fx_ex = tf.expand_dims(Fx, axis=1)  # (num, 1, w, N)
Fx_ex = tf.tile(Fx_ex, [1, c, 1, 1])  # (num, c, w, N)
tmp = tf.matmul(X, Fx_ex)  # (num, c, h, N)

Fy_ex = tf.expand_dims(Fy, axis=1)  # (num, 1, N, h)
Fy_ex = tf.tile(Fy_ex, [1, c, 1, 1])  # (num, c, N, h)
res = tf.matmul(Fy_ex, tmp) # (num, c, N, N)
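For a sense of scale (assuming float32): with the example shapes, the tiled X alone is 1000 × 20 × 40 × 50 = 40,000,000 elements, about 160 MB, before counting the tiled copies of Fx and Fy.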
2 answers

This is a job for einsum, for example:

>>> import numpy as np
>>> X = np.random.rand(1, 20, 40, 50)
>>> Fx = np.random.rand(100, 50, 10)
>>> Fy = np.random.rand(100, 10, 40)
>>> np.einsum('nMh,uchw,nwN->ncMN', Fy, X, Fx).shape
(100, 20, 10, 10)

The same works in tf, since tf.einsum follows numpy's einsum semantics (tf provides an einsum op as well). It replaces the expand_dims/tile/matmul chain with a single call, so there are no explicit tiled copies.
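A minimal tf sketch of the same contraction, assuming a TF version whose tf.einsum accepts a three-operand subscript string (the size-1 leading axis of X is squeezed away first so the subscripts stay simple):

import tensorflow as tf

# X: (1, c, h, w), Fx: (num, w, N), Fy: (num, N, h)
Xs = tf.squeeze(X, axis=0)                        # (c, h, w)
res = tf.einsum('nMh,chw,nwN->ncMN', Fy, Xs, Fx)  # (num, c, N, N)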



Thanks to @phg's answer. In my case, though, num, h and w are dynamic, i.e. None, and tf.einsum in r1.0 did not handle None dimensions.

There is an issue and a pull request about this, and with that fix None (unknown) dimensions are accepted.

So the answer here is also tf.einsum: once None dimensions are supported, tf.einsum does the whole computation directly.
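A minimal sketch of that setup, assuming TF 1.x-style placeholders for the unknown dimensions; splitting the contraction into two two-operand einsums mirrors the original pair of matmuls:

import tensorflow as tf

c, N = 20, 10
X  = tf.placeholder(tf.float32, [1, c, None, None])  # (1, c, h, w); h, w unknown
Fx = tf.placeholder(tf.float32, [None, None, N])      # (num, w, N); num, w unknown
Fy = tf.placeholder(tf.float32, [None, N, None])      # (num, N, h); num, h unknown

Xs  = tf.squeeze(X, axis=0)                 # (c, h, w)
tmp = tf.einsum('chw,nwN->nchN', Xs, Fx)    # (num, c, h, N)
res = tf.einsum('nMh,nchN->ncMN', Fy, tmp)  # (num, c, N, N)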


Source: https://habr.com/ru/post/1671811/

