One of my chat windows with a long chat history no longer works. I noticed this behaviour in the last couple of days. The chat window opens, but the token counter at the top, next to the chat title, just spins and never shows a number. The prompt run button stays disabled until the token count has loaded, so the window has become unusable. I can neither access the context I need for content generation nor use the window to generate new content. I have seen the same in a few other chat windows with longer chat histories, and therefore likely high token counts, whereas a fairly new window with a smaller token count (up to a four-digit figure) works fine.
Is anyone else facing this and has resolved it? If so, how?
What is the best way to externalise the context needed for content generation so this doesn't happen? It's not only the context; it's also the other historically generated content.
AI Studio runs entirely in your browser. As your chat history grows, the browser has to render and manage an increasingly large number of UI elements. Chat history is also stored in the Google Drive associated with your account.
Please clear your browser cache and check whether your Google Drive storage is full.
I have chats with token counts close to 100K in my account and have never had issues, as I keep freeing up browser memory and Google Drive storage.
It is good practice to keep the context window small for seamless usage. If you are still facing throttling issues, instead of letting the conversation grow indefinitely, periodically ask Gemini to summarize the key points, decisions, and important information from the chat so far. Then open a new chat and enter the summary. This keeps the token count low while preserving the essential information.
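If you want to automate that summarize-and-restart step outside the AI Studio UI, here is a minimal sketch using the google-genai Python SDK (`pip install google-genai`). It assumes you have an API key from AI Studio; the model name, prompt wording, and the exported-history format are illustrative assumptions, not anything AI Studio does for you.

```python
# Sketch: compress a long chat history into a summary, check its token
# cost, and use that summary to seed a fresh chat with a small context.
from google import genai

client = genai.Client(api_key="YOUR_API_KEY")  # assumed: key from AI Studio
MODEL = "gemini-2.0-flash"  # illustrative; any available Gemini model works

# Pretend this is the long chat history copied out of the old window.
old_history = "\n".join([
    "User: ...",
    "Model: ...",
])

# 1. Ask the model to compress the history into a reusable summary.
summary = client.models.generate_content(
    model=MODEL,
    contents=(
        "Summarize the key points, decisions, and important information "
        "from the following conversation so it can seed a new chat:\n\n"
        + old_history
    ),
).text

# 2. Check how many tokens the summary itself costs.
tokens = client.models.count_tokens(model=MODEL, contents=summary)
print(f"Summary token count: {tokens.total_tokens}")

# 3. Paste `summary` as the first message of a new AI Studio chat, or keep
#    going programmatically with the now-small context.
```

The point is simply that the summary, not the full history, becomes the carried-over context, so the new chat starts with a token count the UI can handle.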
Hope these tips help resolve your issue. Have a nice day!
Same problem here, also noticed within the last week, and it is very frustrating. The core issue is that the token-counting process keeps running indefinitely, so I cannot add replies, send new messages, or ask for a summary. Starting a new conversation is not the goal; this conversation has accumulated a lot of discussion and is extremely valuable. I tried using "Branch from here" to make a copy, hoping the token count would be recalculated, but the new conversation fails in the same way. Very annoying.