Remove nodes from graph or reset entire default graph

Python, TensorFlow

Python Problem Overview


When working with the default global graph, is it possible to remove nodes after they've been added, or alternatively to reset the default graph to empty? When working with TF interactively in IPython, I find myself having to restart the kernel repeatedly. I would like to be able to experiment with graphs more easily if possible.

Python Solutions


Solution 1 - Python

Update 11/2/2016

tf.reset_default_graph()

Old stuff

There's reset_default_graph, but it's not part of the public API (I think it should be; does someone want to file an issue on GitHub?)

My work-around to reset things is this:

from tensorflow.python.framework import ops
ops.reset_default_graph()
sess = tf.InteractiveSession()
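A quick sanity check of what the reset does (a minimal sketch using the TF1-style API through tf.compat.v1 so it also runs under TensorFlow 2; the constant is made up for illustration):

```python
# Sketch: reset_default_graph swaps in a fresh, empty default graph.
# tf.compat.v1 gives the TF1-style API under TensorFlow 2.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

a = tf.constant(1.0, name="a")  # adds one Const node to the default graph
print(len(tf.get_default_graph().get_operations()))  # 1

tf.reset_default_graph()  # replace the default graph with a fresh one
print(len(tf.get_default_graph().get_operations()))  # 0
```

Note that any tensors or operations created before the reset now belong to the discarded graph and must not be reused.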

Solution 2 - Python

By default, a session is constructed around the default graph. To avoid accumulating stale nodes in the session, you need to either control the default graph or use an explicit graph.

  • To clear the default graph, you can use the tf.reset_default_graph function.

      tf.reset_default_graph()
      sess = tf.InteractiveSession()
    
  • You can also explicitly construct a graph and avoid using the default one. If you use a normal Session, you will need to fully build the graph before constructing the session. With InteractiveSession, you can just declare the graph and use it as a context to declare further changes:

      g = tf.Graph()
      sess = tf.InteractiveSession(graph=g)
      with g.as_default():
          # Put variable declarations and other tf operations
          # in the graph context
          ....
          b = tf.matmul(A, x)
          ....

      sess.run([b], ...)
    

EDIT: For recent versions of TensorFlow (1.0+), the correct function is g.as_default.
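A runnable version of the explicit-graph pattern with a regular Session (a sketch using tf.compat.v1 so it also runs under TensorFlow 2; the two small constants stand in for the elided A and x above):

```python
# Sketch: build into an explicit graph, then run it with a Session
# tied to that graph rather than the global default.
import tensorflow.compat.v1 as tf

g = tf.Graph()
with g.as_default():
    A = tf.constant([[1.0, 2.0]])    # 1x2 matrix
    x = tf.constant([[3.0], [4.0]])  # 2x1 matrix
    b = tf.matmul(A, x)              # 1x1 result

# The session is bound to g; the global default graph stays empty.
with tf.Session(graph=g) as sess:
    result = sess.run(b)
    print(result)  # [[11.]]
```

Because nothing was ever added to the global default graph, repeated runs of this block leave no dead nodes behind.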

Solution 3 - Python

Tensorflow 2.0 Compatible Answer: In TensorFlow versions >= 2.0, the command to reset the entire default graph, when running in graph mode, is tf.compat.v1.reset_default_graph.

NOTE: The default graph is a property of the current thread. This function applies only to the current thread. Calling this function while a tf.compat.v1.Session or tf.compat.v1.InteractiveSession is active will result in undefined behavior. Using any previously created tf.Operation or tf.Tensor objects after calling this function will result in undefined behavior.

Raises: AssertionError: If this function is called within a nested graph.
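The nested-graph restriction is easy to observe (a small sketch; the empty graph is just for illustration):

```python
# Sketch: reset_default_graph refuses to run inside a nested
# graph context and raises AssertionError instead.
import tensorflow.compat.v1 as tf

raised = False
with tf.Graph().as_default():       # push a nested default graph
    try:
        tf.reset_default_graph()    # not allowed here
    except AssertionError:
        raised = True
print(raised)  # True
```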

Solution 4 - Python

IPython / Jupyter notebook cells keep state between runs of a cell.

Create a custom graph:

def main():
    # Define your model
    data = tf.placeholder(...)
    model = ...

with tf.Graph().as_default():
    main()

Once run, the graph is cleaned up.
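Filled in with a toy model (the placeholder shape and the reduce_sum are assumptions for illustration; tf.compat.v1 is used so the sketch also runs under TensorFlow 2):

```python
import tensorflow.compat.v1 as tf

def main():
    # Define a toy model inside whatever graph is currently default
    data = tf.placeholder(tf.float32, shape=(None, 3), name="data")
    model = tf.reduce_sum(data, name="model")
    return model

# Each run of this block builds into a fresh, temporary graph
with tf.Graph().as_default():
    main()

# The global default graph was never touched
print(len(tf.get_default_graph().get_operations()))  # 0
```

Re-running the cell simply builds a new throwaway graph, so state never accumulates between runs.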

Solution 5 - Python

Not sure if I faced the very same problem, but

tf.keras.backend.clear_session()

at the beginning of the cell in which the model (Keras, in my case) was constructed and trained helped to "cut the clutter" so only the current graph remains in the TensorBoard visualization after repeated runs of the same cell.

Environment: TensorFlow 2.0 (tensorflow-gpu==2.0.0b1) in Colab with built-in TensorBoard (using the %load_ext tensorboard trick).
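The effect is visible in layer naming: clear_session also resets Keras's global name counters, so a rebuilt model gets fresh names instead of dense_1, dense_2, and so on (a minimal sketch; the one-layer model is made up):

```python
import tensorflow as tf

for _ in range(3):
    tf.keras.backend.clear_session()  # drop the old graph/state each iteration
    model = tf.keras.Sequential(
        [tf.keras.layers.Dense(1, input_shape=(4,))]
    )
    print(model.layers[0].name)  # "dense" every time, never dense_1, dense_2
```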

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type        | Original Author     | Original Content on Stackoverflow
Question            | Mohammed AlQuraishi | View Question on Stackoverflow
Solution 1 - Python | Yaroslav Bulatov    | View Answer on Stackoverflow
Solution 2 - Python | Thomas Moreau       | View Answer on Stackoverflow
Solution 3 - Python | Tensorflow Support  | View Answer on Stackoverflow
Solution 4 - Python | Serge               | View Answer on Stackoverflow
Solution 5 - Python | John Doe            | View Answer on Stackoverflow