Chat failure due to binary data from Code Execution

There is a conflict between the Python Sandbox and the UI file manager.

  1. Code execution successfully generates a binary file (in my case, application/octet-stream).

  2. The UI allows downloading this file.

  3. However, the internal file indexer/tokenizer appears to crash or hang when it tries to process this file in the background (since it’s not text, an image, or a PDF).

  4. This leads to the Code Execution environment becoming unresponsive.

Users can’t upload such files themselves, yet the system allows them to be created and then fails to handle them gracefully afterwards.

Hi @CiNoP ,

Welcome to the forum!
Thanks for letting us know about the problem! To help us replicate the issue and find a fix, could you please share the steps to reproduce it?

  1. In AI Studio, enable the code execution function
  2. Ask Gemini to use the Python interpreter to generate your binary file. In my case, it’s a small neural network for digit recognition
  3. Ask Gemini to save the neural network file (so you don’t have to wait ~15 seconds to retrain it each time); the file automatically appears in your chat
  4. Code Execution is now unavailable to you

chat link:

And to clarify: you can’t even continue the chat once such a file exists. I asked it to generate a .docx file, and that also broke the chat.
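The .docx case is consistent with the binary-file theory: a .docx is a ZIP container (OOXML), so it too is served as binary data rather than text. A minimal sketch of an equivalent artifact using only the standard library (the filename and placeholder content are hypothetical):

```python
import zipfile

# A .docx is a ZIP archive; even a near-empty one is a binary file,
# so the same background-indexing failure would apply.
with zipfile.ZipFile("report.docx", "w") as z:
    z.writestr("[Content_Types].xml", "<Types/>")  # placeholder part

# ZIP files begin with the 'PK' magic bytes — binary, not text.
with open("report.docx", "rb") as f:
    print(f.read(2) == b"PK")  # prints True
```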