The webpage becomes super slow after reaching 220,000 tokens; I found many "CountTokens" requests in Chrome dev tools.

What can I do about this? It takes >1 min to load the page, and I have to wait after I input a few characters into the textarea. (It recounts the tokens whenever the text changes, I guess.)

I know I can just open a new context window (a new context window doesn't have this issue), but I need a long conversation, and how can I test the so-called 2M-token context window if everything grinds to a halt at 200k?
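For what it's worth, if the slowdown really is a token recount firing on every keystroke, the usual web-UI mitigation is debouncing the expensive call. This is only a generic sketch of the pattern, not AI Studio's actual code:

```javascript
// Generic debounce: delay calling fn until `wait` ms have passed
// since the last invocation, so rapid keystrokes trigger one count.
function debounce(fn, wait) {
  let timer;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), wait);
  };
}

// Hypothetical usage: recount tokens at most once per pause in typing.
// const recount = debounce(sendCountTokensRequest, 500);
// textarea.addEventListener('input', recount);
```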

Another screenshot


The page seems to have become a little faster after I fed that damn "CountTokens" endpoint into uBlock.

uBlock custom rule:

||alkalimakersuite-pa.clients6.google.com/*/*/CountTokens

There are 6xx of them (notice the uBlock counter):
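If you want a rough count of those requests without uBlock, the browser's Resource Timing API can filter the page's network entries by URL. A sketch (the helper name `countRequests` is mine; note the resource timing buffer is size-limited by default, so it may undercount on a long session):

```javascript
// Count resource entries whose URL contains a given substring.
function countRequests(entries, needle) {
  return entries.filter((e) => e.name.includes(needle)).length;
}

// In the AI Studio tab's DevTools console, something like:
//   countRequests(performance.getEntriesByType('resource'), 'CountTokens')
// shows how many CountTokens calls are in the timing buffer.
```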

And there are many "failed to count tokens" errors now.
Screenshot:

Well, hello there. Here is my email: Modified by moderator. If you could send me some details, I would be happy to try and help solve your problem. Please keep in mind the amount of data that has to be processed when you pass the 200-thousand-token mark. You'll also notice that if it is a mixture of image, audio and text, issues can start occurring from 75,000 to 135,000 tokens already. Remember that your environment has to connect to everything else, and this includes server-side connections and so forth. Please drop me an email and I can help you based on your specific issue; it could simply be because of the way everything is structured. There are a few things to take into consideration. I hope to hear from you soon.

Hey.

What are the details you need?

I can tell you the data type is text only. The content is some daily chat, playing a True or False quiz game, some personal background, etc. (The last part makes me not want to share it with other people…)

Oh, I completely understand your wish for privacy, and you shall have it. The details I would require are, for example: the basic construct of the chat, which browser interface you are using, how many tabs you have open, and the latency of your network.

I see you also pointed out the "failed to count tokens" error message. Remember that, acting through the AI Studio interface, Gemini does not have complete access to and control of your Google Drive. Besides using a function-calling method or the grounding-with-Google-Search option, you can provide what is needed for Gemini to view your Drive and use your installed Gemini app to help you make changes, but through the context window Gemini has no direct access to or control of your Drive. Everything you discuss and all the images you send might get saved to your Google Drive, but the file containing the construct and the actual chat interaction is saved as an independent file, and all the other files need to be retrieved and then reconstructed in the context window. For example, if you check your Drive you'll have an AI Studio folder; if you move something out of that folder, it affects how the entire construct is loaded in the interaction window. So when you say the token counter errors, is it an error loading the .csv file, or does the error occur loading, for example, a .jpeg? Do you have a log file you can send me?

Please excuse my speech-to-text errors.

I said in the topic that I used Chrome dev tools, so I am using Chrome.
I hit this issue no matter how many tabs I have open (a few tabs or 1X tabs).

I already know all the chats and images are saved in my Google Drive; everything is fine when I open Google Drive.

I can screenshot the performance monitor while the page is loading and show you (I turned off all extensions):

When the page starts loading:

Second image:

(notice the DOM node count, 2M there)
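For reference, the DOM node figure the performance monitor reports can be sanity-checked from the console with `document.querySelectorAll('*').length`. The recursive version below gives the same kind of count and also runs outside a browser on any `{ children: [...] }` tree (the function name is mine):

```javascript
// Count a node and every descendant under it.
function countNodes(node) {
  let total = 1;
  for (const child of node.children || []) {
    total += countNodes(child);
  }
  return total;
}
```

Millions of nodes kept live in one page is the kind of number where layout and rendering alone get slow, regardless of the CountTokens traffic.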

I would like to ask you to check something for me in the chat you constructed, the one you cannot expand beyond those 200+ thousand tokens. When you click on a response or an input, you'll see there's an option to branch from there. Could you branch, so that you don't lose the context or meddle with the current construct of the chat you're using? Then, in the branched-off environment you created, run through your inputs and remove any media that equates to more than 2,300 tokens (not your standard text inputs; only audio, video and images). Once you've done that, save the branched-off chat, reload the page, and time it to determine whether the error is token counting or token retrieval. If the error is neither, we'll move our focus to your browser interface. As I said before, the reason the entire thing slows down is the amount of data handling that has to take place; as soon as we can, we should identify whether it's a localized occurrence, the interface, or the server on Gemini's end.

I will stop using speech to text Im sorry :rofl:

Okay. I branched the long chat and deleted ~2,500 tokens' worth of content on the new branch. Nothing changed; the "CountTokens" calls and DOM nodes are still around the same numbers.

The model name is "Gemini 2.0 Pro Experimental 02-05", and all the text is from this model. This is a bit sad; I quite liked that bot.

I remember the page had already clearly started slowing down when I reached ~150,000 tokens.

Okay, so you did remove some of the data that has a high token cost and the environment was still pretty slow. Now we know it is definitely a data-handling issue rather than a retrieval issue. Next we need to determine whether it is a server-side handling issue or the interface you are using when interacting with the model. Does the model's response seem slow? I don't mean, for example, that the response takes 40 or 50 seconds; I mean, does the model struggle to generate the response? For example, when the second counter starts counting, does it stop or get stuck on a number? Or does the slowness appear when you are scrolling around, typing, or inserting media? If the model doesn't respond or takes a long time to start generating a response, it is server-side; but if it's the environment, like unresponsiveness on your device, then it is definitely localized. I'm pretty sure it is the browser interface: when your interface has to distribute and handle that amount of data (the total context of the chat you are having), errors occur. What is your browser version and what are your device specifications? But it is almost certainly a localized data-handling issue.

I am delighted and really excited to hear from you. Amazing!
Best wishes,
Lahcene