No new conversation gets saved after a certain point, and token usage is not showing

I am using Gemini on AI Studio and having a very peculiar problem. To start with, the token count has been flashing since very early on, far from the 1 million token limit, and I cannot see how many tokens I have used. On top of that, I now lose an entire day's new conversation: when I restart the computer, I am back to where I started the session, at the same day and hour each time. What do I do here? It has happened multiple times. Do I need to start a new session? Is there a fix? The missing token count is the most important part, because without it you cannot know when and how to wrap up.


It’s this old, incompletely fixed issue: BUG: Stuck on “Please wait for content to load” - #110 by bob_zhang

Basically, to save your work, make a copy or a branch. If you make a copy, the new copy will have all the content; then you can delete the old one. If you make a branch, both the old conversation and the branch will have the new content saved; then you can delete the new branch.

That is not the issue at all. In the session I am currently corresponding and working with, it responds to me as normal. But if I exit and come back, no matter how much conversation we have had, it is all gone. It seems like metadata is corrupted somehow! And token usage is not showing. My project is huge, hundreds of files and tens of thousands of lines of code, so no matter what, I need new sessions all the time, but I need to know and be prepared for it. Branching will transfer everything, so we will fill up very soon. Branching at the end of the session is a very basic patch, far from a solution: if the internet is interrupted or I somehow exit the session, all is lost, and I am doing a ton of work. This is a problem that needs fixing, and it does not happen all the time. At the very least I must be able to see the token usage; without it I will not know when to transfer the knowledge and start a fresh session. A fresh session requires a fresh knowledge transfer without the massive noise of debugging and coding from the previous session, which keeps the model from thinking well, understandably, since we are dealing with a massive amount of code.

Yes, that is the very issue discussed in that other bug thread. When a conversation gets long, the token count stops working properly. If you close that conversation and come back to it, it won’t show the total tokens at all, though on a PC browser you can hover over where the token count is supposed to be and still see the total so far. And no new content in that conversation is being saved: if you click the three vertical dots in the top right corner, you’ll see it isn’t getting saved because the system is stuck on “Please wait for the content to finish loading”.

I have this problem too, but not with tokens, rather with the conversation not saving. This started a little while ago, a few weeks or so for me. My suggestion is: before you shut down the PC, ask AI Studio for a summary of everything you researched with it, from the very first moment if possible. You used it for a few days and then it started having problems, right? So everything up to that point is still there; ask it to send you a big summary of everything you discussed, then copy and paste it into Notepad and save it. When you start the PC the next day, use that summary in the same window.

That won’t work; there is a lot of coding involved. The summary is a better-than-nothing solution, but all the code changes are lost, and these days I do a ton of coding in only a few hours. We simply need a real fix for this problem. Right now my token usage is simply flashing; I have no idea why, or how many tokens are used. It started after I had used only 160k tokens, far from the 1 million limit. I did not lose what I had yesterday, but it is a terrible way to go forward. Even branching is not guaranteed: most likely some corrupted metadata causes this, and branching just copies the errors, so most likely the branch will not show the changes either.
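Until the built-in counter works again, one stopgap is to save the transcript to a text file (or use the summary trick above) and estimate token usage locally. The sketch below is only a rough approximation under an assumed ~4 characters-per-token ratio, which is a common rule of thumb for English text and will be off for dense code; for an exact count, the Gemini API’s count-tokens endpoint could be used instead. The constant names and functions here are illustrative, not part of any official tool:

```python
# Rough local token estimate for a saved AI Studio transcript.
# Assumption: ~4 characters per token (heuristic only; the real
# tokenizer will differ, especially on source code).

CHARS_PER_TOKEN = 4          # assumed heuristic, not Gemini's tokenizer
CONTEXT_LIMIT = 1_000_000    # the 1M-token context window discussed above

def estimate_tokens(text: str) -> int:
    """Return a rough token estimate for `text` (0 for empty input)."""
    return max(1, len(text) // CHARS_PER_TOKEN) if text else 0

def context_report(text: str, limit: int = CONTEXT_LIMIT) -> str:
    """Summarize roughly how much of the context window is used."""
    used = estimate_tokens(text)
    pct = 100 * used / limit
    return f"~{used:,} tokens used (~{pct:.1f}% of {limit:,})"

if __name__ == "__main__":
    # Stand-in transcript; in practice, read your saved session file.
    transcript = "print('hello world')\n" * 5_000
    print(context_report(transcript))
```

Running this against a saved session file before you resume work gives a crude signal for when to plan the knowledge transfer to a fresh session, even while the UI counter is broken.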
