In my bug report, I requested an error log. There are so many repeated posts about the same issues going back to May; a log would make it simpler to track the problem and plan countermeasures, with time estimates to investigate and resolve it. I saw the first response acknowledging it yesterday. Thanks again.
I also created a new issue on the Google Issue Tracker; I hope it will be fixed.
Your context/token usage is likely a large part of it. The 300-400k/1,000k figure doesn't mean what you think it does: it's not how many tokens you have used, it's how much your next message costs.
Remember, every time you press Enter you're sending the entire chat history of that session, every time. It's also why the browser starts lagging more and more after 150k.
This problem manifests even between 45-100K, as already stated. So, unfortunately, your hypothesis is invalid at the root.
Not really, no; it depends on how "big" your average message + AI response is.
Your actual token use is MUCH higher than you think. At 45k you can easily be over 1 million tokens in cumulative use, because you send your entire chat history plus your new message every time you press Enter. Either way, you're certainly within the range where the AI not only begins to hallucinate, but also begins to forget things, well, not forget, but leave things out because of the amount of information it has to sift through.
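To give a rough sense of the arithmetic, here is a minimal sketch (the per-turn sizes are assumptions for illustration, not measurements from AI Studio) of how a chat the UI reports at ~45k tokens can correspond to well over a million tokens actually sent across the session:

```python
# Rough sketch of cumulative token usage when the full chat history is
# resent on every turn. The per-turn sizes below are assumed, not measured.

def cumulative_tokens_sent(turn_sizes):
    """Return (final context size, total tokens sent over the whole session)."""
    context = 0
    total_sent = 0
    for new_tokens in turn_sizes:
        context += new_tokens   # history grows by this turn's message + reply
        total_sent += context   # the whole history is resent on every turn
    return context, total_sent

# Example: 60 turns averaging ~750 tokens each (user message + AI response).
turns = [750] * 60
context, total = cumulative_tokens_sent(turns)
print(f"Context shown in the UI: ~{context:,} tokens")       # ~45,000
print(f"Tokens actually sent this session: ~{total:,}")       # ~1,372,500
```

So even a modest-looking 45k context can mean over a million tokens pushed through the model by the end of the session, under these assumed turn sizes.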
It happens even on a freshly made account.
P.S. This problem happens mainly with 2.5 Pro and much less with 2.5 Flash.
Any news from the dev team, @Lalit_Kumar?
The current absence of news is troublesome…
This issue already existed when I was using Gemini Flash 2.0, and it still persists now. Such problems mostly occur when the context is large, and at that point the chat history often fails to save. Sometimes rerunning brings the thinking mode back.
Unfortunately, this also happens on fresh accounts, in new chats with only 45k tokens and no reruns or deletions, and even with small context messages.
More than one month later and the problem is still unresolved and blatant.