I am building a neural network in TensorFlow using the tf.layers module. For some reason, in the graph visualization I see "report_uninitialized_variables" nodes attached to each part of my graph.
Does anyone have an explanation? Is this related to the get_variable and variable_scope methods?
The graph seems to work; I'm just trying to understand what these nodes mean. I'm not sure whether this is because I am using MonitoredTrainingSession.
They seem to be attached to all variables, including the optimizer's.
https://i.stack.imgur.com/ySFM5.png
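For reference, this is roughly the kind of setup I mean (a minimal sketch with placeholder names and shapes, not my exact code):

```python
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 10], name="x")
y = tf.placeholder(tf.float32, [None, 1], name="y")

# tf.layers creates its variables internally (via get_variable / variable_scope).
hidden = tf.layers.dense(x, 32, activation=tf.nn.relu, name="hidden")
output = tf.layers.dense(hidden, 1, name="output")

loss = tf.losses.mean_squared_error(labels=y, predictions=output)
global_step = tf.train.get_or_create_global_step()
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss, global_step=global_step)

# MonitoredTrainingSession takes care of variable initialization itself.
hooks = [tf.train.StopAtStepHook(last_step=100)]
with tf.train.MonitoredTrainingSession(hooks=hooks) as sess:
    while not sess.should_stop():
        batch_x = np.random.rand(8, 10).astype(np.float32)
        batch_y = np.random.rand(8, 1).astype(np.float32)
        sess.run(train_op, feed_dict={x: batch_x, y: batch_y})
```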
This is an init node, but it seems to be a NoOp, so I'm not sure whether the proper initialization is being done by MonitoredTrainingSession. The strange thing is that training still runs and there is no initialization error. https://i.stack.imgur.com/umrRA.png
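If it is relevant, this is how I tried to convince myself that initialization does happen, assuming tf.report_uninitialized_variables is the op behind those nodes (again just a minimal sketch, not my actual code):

```python
import tensorflow as tf

# One variable just so there is something to check.
v = tf.get_variable("v", shape=[2], initializer=tf.zeros_initializer())
tf.train.get_or_create_global_step()

# Returns the names of variables that are still uninitialized.
check_op = tf.report_uninitialized_variables()

with tf.train.MonitoredTrainingSession() as sess:
    # An empty array here means the session initialized everything.
    print(sess.run(check_op))
```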