I am currently working on a quaternion neural network using TensorFlow (I want to take advantage of GPUs). TensorFlow has no native quaternion type, but a quaternion can be represented as a real 4x4 matrix, so it should be possible to build such a network in TensorFlow.
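For illustration, this is the 4x4 real-matrix representation I have in mind (plain NumPy, and quat_to_matrix is just my own placeholder name):

import numpy as np

def quat_to_matrix(q):
    # Left-multiplication matrix of q = (w, x, y, z):
    # quat_to_matrix(q1) @ q2 gives the Hamilton product q1 * q2
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

# sanity check: i * j = k
assert np.allclose(quat_to_matrix((0, 1, 0, 0)) @ np.array([0, 0, 1, 0]), [0, 0, 0, 1])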
Is there an easy way to add a custom operation or perform a custom operation on tensors?
For example, I can write:
output_activation = tf.nn.softmax(tf.matmul(hidden_activation, Weight_to_output))
... and that's pretty cool! All you have to do is add a loss function and then backpropagate. However, I want to do the same thing with quaternions, for example:
output_activation = mySigmoid(myFunction(hidden_activation, Weight_to_output))
However, to keep the computation on the GPU I need to convert between quaternions and tensors, so I need a function that takes tensors as parameters and returns the transformed tensors. I looked at py_func, but it seems that you cannot return tensors from it.
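To make it concrete, here is a rough sketch of the kind of op I would like, assuming a "split" representation where every quaternion-valued tensor is stored as four real tensors; quaternion_matmul and mySigmoid are my own placeholder names, not existing TensorFlow functions:

import tensorflow as tf

def quaternion_matmul(a, w):
    # a: four real tensors (a_w, a_x, a_y, a_z), each of shape [batch, n_in]
    # w: four real weight tensors (w_w, w_x, w_y, w_z), each of shape [n_in, n_out]
    # Hamilton product expanded into ordinary matmuls, so it stays differentiable.
    aw, ax, ay, az = a
    ww, wx, wy, wz = w
    rw = tf.matmul(aw, ww) - tf.matmul(ax, wx) - tf.matmul(ay, wy) - tf.matmul(az, wz)
    rx = tf.matmul(aw, wx) + tf.matmul(ax, ww) + tf.matmul(ay, wz) - tf.matmul(az, wy)
    ry = tf.matmul(aw, wy) - tf.matmul(ax, wz) + tf.matmul(ay, ww) + tf.matmul(az, wx)
    rz = tf.matmul(aw, wz) + tf.matmul(ax, wy) - tf.matmul(ay, wx) + tf.matmul(az, ww)
    return rw, rx, ry, rz

def mySigmoid(q):
    # one possible choice: apply the sigmoid to each component separately
    return tuple(tf.sigmoid(c) for c in q)

Since this only uses standard ops, I assume TensorFlow could differentiate it and run it on the GPU, but I don't know whether this is the intended approach.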
Going the py_func route, I tried the following, but it failed:
def layerActivation(inputTensor, WeightTensor):
    newTensor = tf.matmul(inputTensor, WeightTensor)
    return newTensor
... and in main():
x = placeholder ...
W_to_hidden = tf.Variable
test = tf.py_func(layerActivation, [x, W_to_hidden], [tf.float32])

with tf.Session() as sess:
    tf.initialize_all_variables().run()
    king_return = sess.run(test, feed_dict={x: qtrain})
Error: Not implemented: Unsupported object type Tensor
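If I understand the docs correctly, py_func passes the Python function evaluated NumPy arrays and expects NumPy values back, so returning the result of tf.matmul (a Tensor) is exactly what triggers this error. A NumPy version like the sketch below would probably run, but as far as I can tell py_func has no gradient registered, which brings me back to my real problem:

import numpy as np

def layerActivation(input_array, weight_array):
    # inside py_func these arguments are NumPy arrays, not Tensors,
    # and the return value must be NumPy as well
    return np.matmul(input_array, weight_array).astype(np.float32)

test = tf.py_func(layerActivation, [x, W_to_hidden], [tf.float32])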
Ideally, I could use this output_activation in the standard TensorFlow backprop algorithm, but I don't know if this is possible.
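What I am hoping is that, if output_activation is built purely from standard TensorFlow ops (as in the quaternion_matmul sketch above), the usual optimizer call would handle the gradients for me, roughly like this (the target placeholder y_ and n_out are hypothetical):

# hypothetical target placeholder with the four quaternion components concatenated
y_ = tf.placeholder(tf.float32, [None, 4 * n_out])
loss = tf.reduce_mean(tf.square(tf.concat(list(output_activation), axis=1) - y_))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)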