Predictions stuck at 50/50 and loss starts very high

My predictions are always [0.5, 0.5], my weights are bouncing, and my loss starts around 30 and decreases only slowly.

Model: conv -> pool -> conv -> pool -> fully connected (ReLU) -> fully connected (softmax applied to the logits later).

The same architecture works great with TFLearn.

How I calculate the loss (note it is sparse_softmax — how could the predictions go to [0.5, 0.5] when each example has only one true label?):

 def calcLoss(logits, train_labels_node, WFully, BFully):
     loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
         labels=train_labels_node, logits=logits))
     return loss
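As a sanity check on the "loss starts ~30" symptom: for a K-class softmax with logits near zero, the initial cross-entropy should be about log(K), since every class gets probability 1/K. A minimal sketch (plain Python, no TensorFlow needed):

```python
import math

# Expected initial cross-entropy for a K-class softmax when logits are ~0:
# each class gets probability 1/K, so loss = -log(1/K) = log(K).
def expected_initial_loss(num_classes):
    return math.log(num_classes)

# For 2 classes the loss should start near 0.693, not ~30.
print(expected_initial_loss(2))
```

A starting loss far above log(K) usually means the logits are huge, i.e. the weights or inputs are poorly scaled, which also drives the softmax into saturation and makes training bounce.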

How I calculate the optimizer:

 def calcOptimizer(learning_rate, loss):
     return tf.train.AdamOptimizer(learning_rate).minimize(loss)
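For reference, this is roughly what one Adam update does per variable. A minimal plain-Python sketch, assuming TensorFlow's default hyperparameters (beta1=0.9, beta2=0.999, eps=1e-8); `adam_step` is an illustrative helper, not a TensorFlow API:

```python
import math

# One Adam update step for a single scalar weight, assuming default
# hyperparameters. t is the 1-based step count.
def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v
```

Because the update is normalized by the gradient magnitude, Adam's early steps are close to +/- lr regardless of how large the gradients are, which is why a too-large base learning rate can make weights bounce.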

Losses

[loss curve plot]

Edit: after playing with the options and adding:

 tf.train.create_global_step()
 learningrate = tf.train.exponential_decay(
     0.01,                                        # Base learning rate.
     tf.train.get_global_step() * Constants.BATCH_SIZE,
     train_size,                                  # Decay step.
     0.95,                                        # Decay rate.
     staircase=True)
 optimizer = tf.train.AdamOptimizer(..).minimize(
     loss, global_step=tf.train.get_global_step())

Now it works.
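The fix works because `exponential_decay` is a function of the global step: unless `global_step` is passed to `minimize()`, the step never increments and the learning rate never decays. A plain-Python sketch of the staircase schedule; `batch_size=64` and `train_size=10000` are hypothetical stand-ins for `Constants.BATCH_SIZE` and the actual `train_size`:

```python
# Staircase exponential decay: the rate drops by decay_rate once the
# number of processed samples (step * batch_size) crosses each multiple
# of train_size, i.e. roughly once per epoch.
def decayed_lr(step, base_lr=0.01, batch_size=64, train_size=10000,
               decay_rate=0.95):
    epochs = (step * batch_size) // train_size  # staircase=True: integer division
    return base_lr * (decay_rate ** epochs)
```

With `global_step` left out, `step` stays 0 forever and this function always returns the base rate.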


Source: https://habr.com/ru/post/1273829/
