Writing a custom cost function in TensorFlow

I am trying to write my own cost function in TensorFlow, but apparently I cannot "slice" the tensor object?

 import tensorflow as tf
 import numpy as np

 # Establish variables
 x = tf.placeholder("float", [None, 3])
 W = tf.Variable(tf.zeros([3, 6]))
 b = tf.Variable(tf.zeros([6]))

 # Establish model
 y = tf.nn.softmax(tf.matmul(x, W) + b)

 # Truth
 y_ = tf.placeholder("float", [None, 6])

 def angle(v1, v2):
     return np.arccos(np.sum(v1 * v2, axis=1))

 def normVec(y):
     return np.cross(y[:, [0, 2, 4]], y[:, [1, 3, 5]])

 angle_distance = -tf.reduce_sum(angle(normVec(y_), normVec(y)))

 # This is the example code they give for cross entropy
 cross_entropy = -tf.reduce_sum(y_ * tf.log(y))

I get the following error: TypeError: Bad slice index [0, 2, 4] of type <type 'list'>

+5
4 answers

Currently, TensorFlow cannot gather along axes other than the first; support for this has been requested.

But for what you want to do in this particular situation, you can transpose, gather columns 0, 2, 4, and then transpose back. It will not be insanely fast, but it works:

 tf.transpose(tf.gather(tf.transpose(y), [0,2,4])) 

This is a useful workaround for some of the limitations in the current implementation of gather.

(It is also true that you cannot use numpy slicing on a TensorFlow graph node; you can only run the graph and slice the output, and you also need to initialize the variables before running. :) ) You are mixing tf and np in a way that will not work.

 x = tf.Something(...) 

is an object in the TensorFlow graph. Numpy does not know how to handle such objects.

 foo = sess.run(x)

returns an object that python (and numpy) can handle.

Usually you want to keep the loss calculation in pure TensorFlow, so do the cross product and the other functions with tf ops. You will probably have to implement arccos the long way, since tf has no function for it.
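
A minimal sketch of what that pure-TF version of the question's loss could look like, assuming a later 1.x-style API where tf.cross and tf.acos are available (neither exists in the version discussed in this answer, so on older builds they would have to be composed from more basic ops). It reuses the y and y_ tensors defined in the question:

 import tensorflow as tf

 def gather_cols(t, cols):
     # Workaround from above: gather only works on the first axis,
     # so transpose, gather the desired columns, and transpose back.
     return tf.transpose(tf.gather(tf.transpose(t), cols))

 def norm_vec_tf(t):
     # Cross product of the (0, 2, 4) and (1, 3, 5) column triples, like normVec.
     return tf.cross(gather_cols(t, [0, 2, 4]), gather_cols(t, [1, 3, 5]))

 def angle_tf(v1, v2):
     # Same as the numpy angle(), but with tf ops (tf.acos assumed available).
     return tf.acos(tf.reduce_sum(v1 * v2, axis=1))

 # y and y_ are the model and truth tensors from the question
 angle_distance = -tf.reduce_sum(angle_tf(norm_vec_tf(y_), norm_vec_tf(y)))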

+6

Just realized that the following fails:

 cross_entropy = -tf.reduce_sum(y_*np.log(y)) 

You cannot use numpy functions on tf objects, and the indexing may also be different.

0

I think you can use the "Wraps Python function" method in a tensor stream. Here is a link to the documentation.

And as for the people who answered "Why don't you just build it out of TensorFlow's existing ops?": sometimes the cost function people are looking for cannot be expressed with tf functions, or is extremely cumbersome to write that way.
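
For example, assuming the op being referred to is tf.py_func (the 1.x name behind "Wraps Python function"), the numpy cost from the question could be wrapped roughly like this; note that gradients do not flow through py_func, so this is only useful for evaluating the loss, not for training against it:

 import numpy as np
 import tensorflow as tf

 def np_angle_distance(y_true, y_pred):
     # Plain numpy version of the cost from the question.
     def norm_vec(y):
         return np.cross(y[:, [0, 2, 4]], y[:, [1, 3, 5]])
     def angle(v1, v2):
         return np.arccos(np.sum(v1 * v2, axis=1))
     return (-np.sum(angle(norm_vec(y_true), norm_vec(y_pred)))).astype(np.float32)

 # y_ and y are the truth / model tensors from the question
 angle_distance = tf.py_func(np_angle_distance, [y_, y], tf.float32)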

0

This is because you have not initialized your variables, so you do not actually have tensor values there yet (you can read more in my answer here).

Just do something like this:

 def normVec(y):
     print y
     return np.cross(y[:, [0, 2, 4]], y[:, [1, 3, 5]])

 t1 = normVec(y_)
 # and comment out everything after it

to see that you do not have actual values yet, only Tensor("Placeholder_1:0", shape=TensorShape([Dimension(None), Dimension(6)]), dtype=float32).

Try initializing the variables:

 init = tf.initialize_all_variables()
 sess = tf.Session()
 sess.run(init)

and then evaluating y with sess.run(y). P.S. You still haven't fed your placeholders.
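
A short sketch of what that looks like end to end, with a made-up feed value for the x placeholder (only x needs to be fed to evaluate y):

 import numpy as np

 init = tf.initialize_all_variables()  # tf.global_variables_initializer() in newer versions
 sess = tf.Session()
 sess.run(init)

 # Placeholders have no value until they are fed; feed x to get a value for y.
 batch_x = np.random.rand(4, 3).astype(np.float32)  # made-up example batch
 y_value = sess.run(y, feed_dict={x: batch_x})
 print(y_value.shape)  # (4, 6) -- a plain numpy array now, so normal slicing works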

-1

Source: https://habr.com/ru/post/1235847/
