TensorFlow: Softmax cross-entropy with logits becomes inf

I am working through TensorFlow for Poets. Most of the time, training is interrupted with the error "Nan in summary histogram". I run the following command to retrain on my initial data:

python -m scripts.retrain \
   --bottleneck_dir=tf_files/bottlenecks \
   --model_dir=tf_files/models/ \
   --summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
   --output_graph=tf_files/retrained_graph.pb \
   --output_labels=tf_files/retrained_labels.txt \
   --image_dir=/ml/data/images

This error has been reported elsewhere. I followed the instructions given there and used tfdbg, which gave me a little more understanding (see below). However, I am still stuck, because I don't know why this is happening or what I can do to fix it, having little experience with TensorFlow and neural networks. It is especially confusing because it happens with the stock training code and data.
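To illustrate the error in the title: if cross-entropy is computed by hand as -sum(labels * log(softmax(logits))), large logits make the softmax underflow to exactly zero, log(0) evaluates to -inf, and the resulting inf/NaN values then surface in the summary histogram. TensorFlow's fused op avoids this via the log-sum-exp trick. A minimal sketch (TF 1.x API; the tensors are made up for illustration, not taken from retrain.py):

import tensorflow as tf

logits = tf.constant([[50.0, -50.0], [300.0, -300.0]])  # extreme logits
labels = tf.constant([[1.0, 0.0], [0.0, 1.0]])

# Naive formula: softmax underflows to exactly 0.0 for very negative
# logits, so log(0) = -inf and the loss blows up.
naive = -tf.reduce_sum(labels * tf.log(tf.nn.softmax(logits)), axis=1)

# Fused op: evaluated with the log-sum-exp trick, stays finite.
fused = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run(naive))  # [  0.  inf]
    print(sess.run(fused))  # [  0. 600.]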

Here is the tfdbg output from when the error first appears, showing the details of the failing node:

[Screenshot: tfdbg node details for the node that raises the error]
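For reference, this is roughly how the session can be wrapped to produce that output; it is standard tfdbg usage for TF 1.x (a sketch only, since retrain.py may build its session differently):

import tensorflow as tf
from tensorflow.python import debug as tf_debug

# Wrap the training session in the interactive tfdbg CLI.
sess = tf.Session()
sess = tf_debug.LocalCLIDebugWrapperSession(sess)

# Register the stock inf/NaN filter; then, at the tfdbg prompt,
#   run -f has_inf_or_nan
# stops at the first tensor that contains inf or NaN.
sess.add_tensor_filter("has_inf_or_nan", tf_debug.has_inf_or_nan)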

Comments:

+4 — [Comment garbled in extraction; it refers to the Google retraining script and suggests lowering a hyperparameter, mentioning the value 0.000001 (apparently the learning rate).]

+1 — [Comment garbled in extraction; it discusses running TensorFlow under Python 2.7 versus 3.5 and recommends Python 3.5.]

+1 — [Comment garbled in extraction; it asks about the tf_files directory and about launching retrain.py from Spyder.]
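If the highest-voted comment is indeed suggesting a smaller learning rate, scripts/retrain.py exposes a --learning_rate flag for exactly that (the script's default is 0.01), so the run above could be retried as, for example:

python -m scripts.retrain \
   --learning_rate=0.000001 \
   --bottleneck_dir=tf_files/bottlenecks \
   --model_dir=tf_files/models/ \
   --summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
   --output_graph=tf_files/retrained_graph.pb \
   --output_labels=tf_files/retrained_labels.txt \
   --image_dir=/ml/data/images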

Source: https://habr.com/ru/post/1693833/

