I took the bundled Abalone example and worked through it until I thought I understood it. But since another evaluation project I am working on is producing complete garbage, I tried to add TensorBoard logging so I can see what is actually happening.
Base code: https://www.tensorflow.org/extend/estimators
I added a session and a writer:
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.estimators import model_fn as model_fn_lib

# training_set / test_set and model_fn come from the tutorial code.
model_params = {"learning_rate": 0.01}

with tf.Session() as sess:
    nn = tf.contrib.learn.Estimator(model_fn=model_fn, params=model_params)
    writer = tf.summary.FileWriter('/tmp/ab_tf', sess.graph)
    nn.fit(x=training_set.data, y=training_set.target, steps=5000)
    ev = nn.evaluate(x=test_set.data, y=test_set.target, steps=1)
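(As an aside, my reading of the docs is that the Estimator can also be handed a model_dir and will write its own event files there during fit() — that is my assumption, not something the tutorial shows. A minimal sketch of that variant:)

# Sketch of the model_dir variant (my assumption from the docs:
# the Estimator writes its own event files into model_dir during fit()).
nn = tf.contrib.learn.Estimator(model_fn=model_fn,
                                model_dir='/tmp/ab_tf',
                                params=model_params)
nn.fit(x=training_set.data, y=training_set.target, steps=5000)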
And I added one line (the tf.summary.scalar call) to the model_fn function, so it looks like this:
def model_fn(features, targets, mode, params):
    """Model function for Estimator."""
    # Two hidden ReLU layers and a linear output layer.
    first_hidden_layer = tf.contrib.layers.relu(features, 49)
    second_hidden_layer = tf.contrib.layers.relu(first_hidden_layer, 49)
    output_layer = tf.contrib.layers.linear(second_hidden_layer, 1)

    predictions = tf.reshape(output_layer, [-1])
    predictions_dict = {"ages": predictions}

    loss = tf.losses.mean_squared_error(targets, predictions)
    eval_metric_ops = {
        "rmse": tf.metrics.root_mean_squared_error(
            tf.cast(targets, tf.float64), predictions)
    }

    train_op = tf.contrib.layers.optimize_loss(
        loss=loss,
        global_step=tf.contrib.framework.get_global_step(),
        learning_rate=params["learning_rate"],
        optimizer="SGD")

    tf.summary.scalar('Loss', loss)  # <-- the one line I added

    return model_fn_lib.ModelFnOps(
        mode=mode,
        predictions=predictions_dict,
        loss=loss,
        train_op=train_op,
        eval_metric_ops=eval_metric_ops)
Finally, at the end of the script, I added:

writer.close()
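Just in case it is relevant: my working assumption is that the Estimator builds and runs the model in its own internal graph, not in the default graph that my session and writer see. A quick check for that, which could be added right after fit():

# If my assumption is right, this prints [] -- the summary op created
# inside model_fn lives in the Estimator's internal graph, not in the
# default graph that the FileWriter above was given.
print(tf.get_collection(tf.GraphKeys.SUMMARIES))

An empty list there would mean the writer never had any summary to flush in the first place.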
When I run the code, I do get an events file in /tmp/ab_tf, and the file is not empty. But it is only 139 bytes, which makes me think none of my summaries are actually being written.
And when I open it in TensorBoard (tensorboard --logdir=/tmp/ab_tf), there is no data.
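To see what those 139 bytes actually contain, I figure the event file can be dumped directly (a sketch; the events.out.tfevents.* naming is what I have seen TensorFlow use):

import glob
import tensorflow as tf

# Print every record in the event file(s) under /tmp/ab_tf; if only a
# file-version header (and maybe a graph) comes out, no summaries were
# ever written.
for path in glob.glob('/tmp/ab_tf/events.out.tfevents.*'):
    for event in tf.train.summary_iterator(path):
        print(event)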
What am I doing wrong?
Would appreciate any input.