Accessing the number of queued items in the TensorFlow Dataset API

I am migrating my TensorFlow code from the old queue interface to the new Dataset API. With the old interface, I could monitor the current fill level of the queue by accessing the raw counter in the graph, for example like this:

queue = tf.train.shuffle_batch(...,  name="training_batch_queue")
queue_size_op = "training_batch_queue/random_shuffle_queue_Size:0"
queue_size = session.run(queue_size_op)

However, with the new Dataset API I cannot find any variables in the graph associated with the queues / datasets, so my old code no longer works. Is there a way to get the number of items in a queue (e.g., the buffer of tf.data.Dataset.prefetch or tf.data.Dataset.shuffle) using the new Dataset API?

It is important for me to keep track of the number of elements in the queue, since this tells me about the behaviour of the preprocessing pipeline, in particular whether the preprocessing or the rest of the pipeline (for example, the neural network itself) is the speed bottleneck.

1 answer

As a workaround, you can keep a counter that indicates how many items are in the queue. Here's how to define the counter:

 queue_size = tf.get_variable("queue_size", initializer=0,
                              trainable=False, use_resource=True)

Then, when preprocessing the data (for example, in the function passed to dataset.map), we can increment this counter:

 def pre_processing(element):
    data_size = ...  # compute this (could be just '1')
    queue_size_op = tf.assign_add(queue_size, data_size)  # adding items
    with tf.control_dependencies([queue_size_op]):
        # do the actual pre-processing of `element` here and return the result
        return ...
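
For the increment to take effect, the pre-processing function has to be attached to the input pipeline, typically via dataset.map; a minimal sketch, with the dataset source left as a placeholder:

dataset = ...  # e.g. built with tf.data.Dataset.from_tensor_slices(...)
dataset = dataset.map(pre_processing)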

Then we can decrease the counter every time we run our model with a batch of data:

 def model():
    queue_size_op = tf.assign_add(queue_size, -batch_size)  # removing items
    with tf.control_dependencies([queue_size_op]):
        # define the actual model here

Finally, whenever we want to know how many items are currently in the queue, we can simply read the value of queue_size, e.g.:

 current_queue_size = session.run(queue_size)

(As far as I know, the Dataset API does not expose this information directly, hence the counter workaround.)
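
For reference, here is a self-contained end-to-end sketch of this workaround (TF 1.x style). The toy range dataset, BATCH_SIZE, and the tf.identity stand-ins for the real pre-processing and model are placeholders for illustration only:

import tensorflow as tf

BATCH_SIZE = 32

# Counter tracking how many pre-processed items are currently buffered.
queue_size = tf.get_variable("queue_size", initializer=0,
                             trainable=False, use_resource=True)

def pre_process(x):
    # Each invocation of the map function adds one item to the pipeline.
    increment = tf.assign_add(queue_size, 1)
    with tf.control_dependencies([increment]):
        return tf.identity(x)  # the real pre-processing would go here

dataset = (tf.data.Dataset.range(10000)
           .map(pre_process)
           .batch(BATCH_SIZE)
           .prefetch(1))
iterator = dataset.make_initializable_iterator()
batch = iterator.get_next()

# Consuming a batch removes tf.size(batch) items again.
decrement = tf.assign_add(queue_size, -tf.size(batch))
with tf.control_dependencies([decrement]):
    train_op = tf.identity(batch)  # the real model / train step would go here

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(iterator.initializer)
    sess.run(train_op)  # consumes one batch
    print("items currently buffered:", sess.run(queue_size))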

