How to free memory held by unused tensors

While working with tensors, I find that I seem to waste a lot of memory. A large number of reshapes and NumPy-to-tensor and tensor-to-NumPy conversions, without ever clearing memory, seem to be a burden on my program. How can I clean up these unneeded tensors? Is setting them to None enough?

Hi @jun_yin,

Here are some strategies to help manage memory more effectively when working with TensorFlow (brief code sketches for each point follow the list):

  1. Explicit garbage collection: Python's garbage collector usually handles memory management, but after you drop references to large tensors (with del or by setting them to None) you can call it explicitly.
  2. Use tf.keras.backend.clear_session(): This function clears the current TensorFlow graph and the global Keras state, freeing up memory; it is especially useful when building models in a loop.
  3. Use with statements for temporary resources: Context managers such as tf.GradientTape release the tensors they hold when the context exits, and keeping temporaries inside a narrow scope lets Python reclaim them sooner.
  4. Use tf.function for better performance: Tracing a computation into a graph lets TensorFlow optimize the intermediate operations, which can also reduce memory usage.
  5. Control memory growth: If you are using a GPU, you can enable memory growth to prevent TensorFlow from allocating all the available GPU memory up front.
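On the first point, here is a minimal sketch of dropping references and then running the garbage collector; the tensor names are just placeholders for whatever intermediates your program creates:

```python
import gc

import numpy as np
import tensorflow as tf

# Hypothetical intermediate results that are no longer needed.
big_array = np.random.rand(4096, 4096).astype(np.float32)
big_tensor = tf.convert_to_tensor(big_array)
reshaped = tf.reshape(big_tensor, [-1])

# Drop the Python references (setting them to None works the same way),
# then ask the garbage collector to reclaim anything left unreachable.
del big_array, big_tensor, reshaped
gc.collect()
```

In eager mode the memory backing a tensor is usually released as soon as the last reference goes away, so del (or None) does most of the work; gc.collect() mainly helps when reference cycles are involved.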
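For the second point, a small sketch of clearing the Keras/TensorFlow state between runs; the model itself is only illustrative:

```python
import tensorflow as tf

def build_model():
    # Illustrative architecture; substitute your own.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(32,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])

for run in range(3):
    model = build_model()
    # ... train / evaluate the model here ...
    # Clearing the session drops the global state Keras keeps around
    # (layer name counters, cached graphs), which helps when building
    # many models in a loop.
    tf.keras.backend.clear_session()
```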
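On the third point, a brief sketch with tf.GradientTape: the non-persistent tape holds references to intermediate tensors only until the gradient is computed, so keeping it inside a with block keeps those tensors short-lived:

```python
import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    y = x * x  # intermediate tensors are recorded by the tape

grad = tape.gradient(y, x)
# After gradient() returns, the non-persistent tape releases the
# intermediate tensors it was holding, so they can be freed.
```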
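For the fourth point, a sketch of wrapping a computation in tf.function so TensorFlow can trace it into a graph and optimize the intermediate operations; the function body is just an example:

```python
import tensorflow as tf

@tf.function
def normalize(batch):
    # Traced into a graph on the first call; TensorFlow can then
    # optimize and reuse buffers for these intermediate operations.
    mean = tf.reduce_mean(batch, axis=0)
    std = tf.math.reduce_std(batch, axis=0)
    return (batch - mean) / (std + 1e-8)

batch = tf.random.uniform([256, 32])
print(normalize(batch).shape)
```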
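For the fifth point, the usual pattern for enabling memory growth on any visible GPUs; note that it must run before the GPUs are first used:

```python
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
for gpu in gpus:
    # Allocate GPU memory on demand instead of grabbing it all up front.
    # This must be set before any GPU has been initialized.
    tf.config.experimental.set_memory_growth(gpu, True)
```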

Hope this helps.

Thank you.