Just adding a few minor points to @Yaroslav-Bulatov's answer.

As Yaroslav's answer shows:
tf.control_dependencies alone does not create any operation; it only adds dependencies to the operations you create within its scope. tf.group, in contrast, creates a single new operation (of type NoOp) and adds the dependencies to that operation.
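A minimal sketch of the difference, assuming TF 1.x graph mode (via tensorflow.compat.v1 in TF 2.x); the op names are illustrative only:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

a = tf.constant(1.0, name="a")
b = tf.constant(2.0, name="b")
x = tf.constant(0.0, name="x")

# tf.control_dependencies creates no op of its own; the ops created
# inside the block (here tf.identity) receive a and b as control inputs.
with tf.control_dependencies([a, b]):
    c = tf.identity(x, name="c")
print(sorted(op.name for op in c.op.control_inputs))  # ['a', 'b']

# tf.group creates one new op of type NoOp and attaches the
# dependencies to that op instead.
g = tf.group(a, b, name="g")
print(g.type)  # 'NoOp'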
More importantly, if the inputs to tf.group are placed on multiple devices, tf.group inserts an intermediate layer between its inputs and the returned node. That layer contains one node per device, so the dependencies are organized per device. This can reduce cross-device data flow.
Therefore, if your dependencies span multiple devices, tf.group adds a (possibly critical) optimization.
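To illustrate the per-device grouping, here is a sketch continuing the example above. The device strings are only graph annotations at construction time (no GPU is needed to build the graph), and the exact two-level tree shape is an implementation detail of tf.group:

```python
# Place the two inputs on different devices.
with tf.device("/device:CPU:0"):
    p = tf.constant(1.0, name="p")
with tf.device("/device:GPU:0"):
    q = tf.constant(2.0, name="q")

# With inputs on multiple devices, tf.group builds a small two-level tree:
# the returned NoOp depends on one intermediate NoOp per device.
g2 = tf.group(p, q, name="g2")
print([ci.type for ci in g2.control_inputs])  # ['NoOp', 'NoOp']
```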
tf.control_dependencies, on the other hand, supports nesting: an operation created in an inner context gets dependencies on the union of the operations specified in all enclosing contexts.
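A short sketch of the nesting behavior, reusing the tensors defined in the first example:

```python
# Nested control-dependency contexts: the op created in the inner block
# depends on the union of the ops from all enclosing contexts.
with tf.control_dependencies([a]):
    with tf.control_dependencies([b]):
        d = tf.identity(x, name="d")

print(sorted(op.name for op in d.op.control_inputs))  # ['a', 'b']
```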