Keras has an abstraction layer, keras.backend, which you seem to have found already (you name it K). This layer provides all the Theano and TensorFlow features you will need here.
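For reference, a minimal sketch of the usual import convention (just an illustration; the active backend is whatever is set in your Keras configuration):

from keras import backend as K
print(K.backend())  # prints 'tensorflow' or 'theano', depending on your config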
Say this is the TensorFlow code that works for you:
std_var = tf.tile(tf.reshape(tf.exp(log_var), [1, -1]), (tf.shape(Mean)[0], 1))
Then the backend-agnostic equivalent would be:
std_var = K.tile(K.reshape(K.exp(log_var), (1, -1)), (K.shape(Mean)[0], 1))
Both Theano and TensorFlow accept -1 in a reshape (the size of that axis is inferred), so this should behave the same under either backend. I have not double-checked the inferred dimension under TF, though; if it gives you trouble, you can avoid it by moving the -1 to axis 0 and giving the known length of log_var (num here) explicitly for axis 1:
std_var = K.tile(K.reshape(K.exp(log_var), (-1, num)), (K.shape(Mean)[0], 1))
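A minimal, self-contained sketch to sanity-check the shapes (the names log_var, Mean and the sizes below are made up for the illustration; assumes Keras 2.x on the TensorFlow backend):

import numpy as np
from keras import backend as K

num = 4                                 # assumed length of log_var
log_var = K.variable(np.zeros(num))     # shape (num,)
Mean = K.variable(np.zeros((3, num)))   # a batch of 3 samples

# turn exp(log_var) into a single row, then repeat it once per row of Mean
std_var = K.tile(K.reshape(K.exp(log_var), (1, -1)), (K.shape(Mean)[0], 1))

print(K.eval(std_var).shape)            # (3, 4): one row per sample

Either way, the result has one row per row of Mean, which is exactly what the tf.tile call above produces.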