Too many input tokens error

Today (or more precisely ~6 minutes ago) I started getting an error that reads "Failed to generate content, too many input tokens: 131085 exceeds the limit of 131072. Please adjust your prompt and try again." I personally do not know what could be causing this, as the conversation barely has 150,500 tokens, while I've had conversations that nearly reach 600,000 tokens. If anyone knows how to fix this (if it can be fixed at all, unless it's a bug within AI Studio), I'd appreciate it.


Hi @ostauemayer,

Welcome to the Google AI Forum! 🎉🎉

To help you better, can you please specify which platform and model you are using?

AI Studio web (free tier), Gemini 2.5 Pro. Oddly enough, once I switched to 2.5 Flash, it gave me the survey to pick between two responses, and I picked one. It generated fine and kept working until I refreshed the page; all replies from that point had disappeared, and rerunning the prompt kept giving the same "Too many input tokens" error.

The issue is still not fixed, and in a separate prompt (which has ~590 thousand tokens) it just says that it failed to generate, with no error reason given. It would usually say "permission denied" or "quota exceeded", but now it says nothing. I don't know what could be causing this kind of problem after problem with specific prompts, considering I've had one with 755 thousand tokens that still works perfectly fine.


Hi @ostauemayer,

Apologies for the delay. Are you still facing this issue?


I am also facing this issue


Same issue here. It occurred a couple of minutes ago. Can't post much of anything in this specific branch with Gemini 2.5 Pro now…


Any solution to this, aside from deleting earlier parts of the conversation? As it stands, it kind of makes branches/lengthy projects useless.
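In case it helps anyone, here is a rough workaround sketch for the "delete earlier parts" approach, not an official fix. The `trim_history` and `estimate_tokens` helpers are hypothetical names I made up, and the ~4 characters per token ratio is a crude assumption, not Gemini's real tokenizer, so it keeps a safety margin:

```python
# Rough workaround sketch: drop the oldest turns until the estimated
# token count fits under the model's input limit. The ~4 chars/token
# ratio is an approximation, hence the safety margin.

def estimate_tokens(text: str) -> int:
    """Crude token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

def trim_history(turns: list[str], limit: int = 131_072,
                 margin: int = 2_048) -> list[str]:
    """Keep as many of the most recent turns as fit under the budget."""
    budget = limit - margin
    kept: list[str] = []
    used = 0
    # Walk from the newest turn backwards, keeping recent context first.
    for turn in reversed(turns):
        cost = estimate_tokens(turn)
        if used + cost > budget:
            break
        kept.append(turn)
        used += cost
    return list(reversed(kept))
```

It doesn't preserve everything (which is exactly the complaint in this thread), but it at least automates picking which recent turns to keep rather than trimming by hand.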


I am also facing the same issue, and it’s not fixed. Please fix if possible.

Facing the same issue …


The error goes away when I remove enough content from the chat to bring the token count back under the limit…
Copying or branching doesn't fix it. 🙁

Any update on a fix for this? It is extremely limiting.

What do you mean?

I’m having the same issue here. There was actually an outage today according to Google AI Studio, but the page says it’s been resolved. I was able to send a couple of messages, but now I’m at 534,214 tokens and it went back to saying “An internal error has occurred.” with the popup saying “Failed to generate content, too many input tokens: 1048989 exceeds the limit of 1048576. Please adjust your prompt and try again.” I also moused over the tokens and it says this:

Token Usage: 534214 / 1048576

Input tokens: 534,214

Output tokens: 155,084

Total tokens: 689,298

I’m working on a lengthy project for work and I’ve been using this one conversation, so not being able to continue it slows down my ability to work with the code. If I were to make a new conversation, I would have to manually go back through this one and select the individual parts to re-upload, and that would take too long. I would be fine with this if I were approaching one million; it’s just that I wasn’t expecting to “hit the limit” so soon.
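For what it's worth, the figures quoted above can be checked directly: even input plus output together stay well under the 1,048,576 limit, so the count in the error message must include tokens the UI doesn't display (the thread doesn't establish what those are, possibly thinking tokens or system overhead, which is speculation on my part). A quick sanity check:

```python
# Sanity-check the figures quoted in the post above.
displayed_input = 534_214   # "Input tokens" shown in the UI
output_tokens = 155_084     # "Output tokens" shown in the UI
ui_total = displayed_input + output_tokens

limit = 1_048_576           # 2**20, the limit quoted in the error
error_count = 1_048_989     # input count reported by the error message

print(ui_total)                # 689298 -- matches the UI's "Total tokens"
print(error_count - ui_total)  # 359691 tokens unaccounted for by the UI
print(error_count > limit)     # True: hence the failure
```

So the error is internally consistent (the reported count really does exceed the limit), but the reported count and the UI's count clearly disagree.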

Same issue here. The UI shows 486,228 tokens, but I get an error message stating “Failed to generate content, too many input tokens: 1143523 exceeds the limit of 1048576. Please adjust your prompt and try again.”

I might have hit the limit, but please do let me know whether it is a UI issue or the Google service is down.

It’s not an outage issue; it’s been going on for months. As soon as you hit around half a million tokens, Studio refuses to process any further requests.

I’m annoyed now because I’ve reached a point where I can’t delete anything else from the prompt history, or the LLM will lose context, which I need it to maintain or else the whole exercise is useless (I’m not exactly stretching this thing, it’s just checking some PDFs for document consistency, but it has been a massive time saver for the exercise).

This issue renders branching borderline useless, and long term projects (like mine) aren’t really viable. I’m thinking of switching to Qwen, which doesn’t have this issue, but it is miles slower and not quite as good at what I need…

👋

Any updates on this issue?

OK, hitting the same hard limit at 420,000 tokens now on Gemini 3 (though I’m not getting the same error now, just a straight-up “Failed to generate content” error, which is persistent no matter how often I retry the post).

I’m really hoping this isn’t some sort of paywall issue…

OK, so I’ve got a thread going that seems to be working, and I’m at over 600,000 tokens. It appears pretty obvious now that my access to it yesterday was being limited by a daily usage limit of some kind. I don’t think this is a technical issue anymore, rather one of clarifying reasons and limits to your end users in a far more satisfactory manner…