Update
I need to provide an update on this issue: after digging into it further, I’m confident this is a bug in AI Studio rather than user error.
What happened was essentially a perfect storm. I have a large codebase, and I introduced a major architectural change that touched almost every file. At the same time, I uploaded roughly 150 MB of documentation. That combination appears to have pushed the Code Assistant past its 1,048,576‑token limit.
The real problem is what happened next:
AI Studio provides no way to clear or reset the Code Assistant’s context.
Because of that:
- I cannot clear the chat history
- I cannot reset the assistant
- I cannot remove the codebase
- Every prompt I enter now immediately triggers the same token‑limit error
- The assistant is effectively bricked for this project
The only workaround is to fork the entire project just to get a fresh assistant — which means losing two months of accumulated context and having to rebuild my workflow from scratch. And since AI Studio cannot attach an assistant to an existing repo, I can’t simply reconnect it.
I may be mistaken, but I don’t believe AI Studio should behave this way. A single token‑limit overflow shouldn’t permanently break the Code Assistant for an entire project.
********
Hi everyone,
I hit this error in AI Studio while uploading ~150 MB of data.
Error:
com.google.net.rpc3.util.RpcFutureStream$RpcStreamException: generic::INVALID_ARGUMENT: The input token count exceeds the maximum number of tokens allowed 1048576.
Context
- This is the first time I’ve seen this error.
- At the time it occurred, I was uploading approximately 150 MB of data into the AI Studio environment.
- I’m aware that AI Studio has ongoing stability issues, so I can’t rule out a platform‑side bug or transient failure.
What I’m unsure about
- I don’t fully understand what input is being counted toward the token limit in this context.
- I wasn’t explicitly sending a massive prompt to the model at the time — the activity was primarily data upload and code‑related interaction.
- The token limit mentioned (1,048,576) seems extremely high, which makes me unsure how it was exceeded in this scenario.
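For what it's worth, a back-of-envelope estimate suggests a 150 MB upload could exceed that limit by a wide margin, if the uploaded files are being counted as model input. This is only a rough sketch: the ~4 characters per token figure is a common heuristic for English text, not an exact tokenizer count, and I'm assuming the whole upload counts toward the context.

```python
# Rough estimate: can ~150 MB of text exceed a 1,048,576-token limit?
# Assumes ~4 characters per token (a common heuristic for English prose,
# not an exact tokenizer count) and that the full upload is tokenized.

UPLOAD_BYTES = 150 * 1024 * 1024   # ~150 MB of documentation
CHARS_PER_TOKEN = 4                # rough average for English text
TOKEN_LIMIT = 1_048_576            # 2**20, the limit named in the error

estimated_tokens = UPLOAD_BYTES // CHARS_PER_TOKEN
print(f"Estimated tokens: {estimated_tokens:,}")
print(f"Ratio to limit:   {estimated_tokens / TOKEN_LIMIT:.1f}x")
```

Under those assumptions the upload alone works out to tens of millions of tokens, dozens of times the limit — so the error itself may be plausible; the real issue is that there's no way to recover from it afterward.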
Questions
- Has anyone else encountered this error inside AI Studio’s Code Assistant?
- Is large file upload known to affect token accounting in this way?
- Is this a known limitation or a known issue with the current AI Studio environment?
Any insight or confirmation that others are seeing similar behavior would be very helpful.
Thanks!