What is the difference between tf.group and tf.control_dependencies?

Apart from the fact that tf.control_dependencies is a context manager (i.e. used with Python's with), what is the difference between tf.group and tf.control_dependencies?

When should I use which?

Does tf.group really have no particular order of operations? I would have expected tf.group([op_1, op_2, op_3]) to execute the ops in list order, but maybe it does not? The docstring specifies no ordering behavior.

2 answers

If you look at the GraphDef, c = tf.group(a, b) produces the same graph as

 with tf.control_dependencies([a, b]):
     c = tf.no_op()

There is no specific order in which the ops will run; TensorFlow tries to execute operations as soon as possible (i.e. in parallel where it can).
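The equivalence can be checked directly. A minimal sketch using the TF1-style graph API (via tf.compat.v1, so it also runs on TF2); the op names a, b, c are illustrative:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # build graphs explicitly, TF1 style

# Graph 1: tf.group
g1 = tf.Graph()
with g1.as_default():
    a = tf.no_op(name="a")
    b = tf.no_op(name="b")
    c = tf.group(a, b, name="c")

# Graph 2: control_dependencies + no_op
g2 = tf.Graph()
with g2.as_default():
    a2 = tf.no_op(name="a")
    b2 = tf.no_op(name="b")
    with tf.control_dependencies([a2, b2]):
        c2 = tf.no_op(name="c")

# Both c and c2 are a NoOp whose control inputs are a and b
print(c.type, sorted(op.name for op in c.control_inputs))
print(c2.type, sorted(op.name for op in c2.control_inputs))
```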


Just adding a few minor points to @Yaroslav Bulatov's answer.

As can be seen from Yaroslav's answer:

  • tf.control_dependencies alone does not create any operation; it adds dependencies to the operations you create within its scope.
  • tf.group creates a single operation (of type NoOp) and adds the dependencies to that operation.

More importantly, if the arguments to tf.group are placed on multiple devices, tf.group will insert an intermediate layer between its inputs and the returned node. This layer contains one node per device, so the dependencies are organized per device. This can reduce the data flow between devices.

Therefore, if your dependencies span multiple devices, tf.group adds a (possibly important) optimization.
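This per-device layering can be observed at graph-construction time; the device strings below are illustrative (placement is only resolved when the graph runs, so no GPU needs to exist to build it):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

g = tf.Graph()
with g.as_default():
    with tf.device("/device:CPU:0"):
        a = tf.no_op(name="a")
    with tf.device("/device:GPU:0"):  # illustrative device string
        b = tf.no_op(name="b")
    c = tf.group(a, b, name="c")

# tf.group saw inputs on two devices, so its control inputs are
# intermediate per-device NoOps rather than a and b directly.
for dep in c.control_inputs:
    print(dep.name, dep.type, dep.device)
```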

tf.control_dependencies, on the other hand, supports nesting: an inner context adds its dependencies on top of the union of the operations from all outer contexts.
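A small sketch of that nesting behavior (op names a, b, c are illustrative):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

g = tf.Graph()
with g.as_default():
    a = tf.no_op(name="a")
    b = tf.no_op(name="b")
    with tf.control_dependencies([a]):      # outer context
        with tf.control_dependencies([b]):  # inner context
            c = tf.no_op(name="c")

# c depends on the union of outer and inner dependencies
print(sorted(op.name for op in c.control_inputs))
```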


Source: https://habr.com/ru/post/1014286/

