I gave it a code file of about 300,000 tokens to read at the start, and then my computer reported insufficient memory. The browser's background memory usage was sitting at 3 GB even though I'm sure I only had one AI tab open; after I closed that tab, memory returned to normal. In the subsequent conversation Gemini responded more and more slowly. Finally I asked it about a function in the code file, to explain why the program stopped running after I changed some things in that function. It just kept thinking, and after about a minute it reported an internal error. I'm using Gemini 1.5 Pro, Token: 393,908 / 2,097,152, Temperature 1.0, everything else at the defaults.
Try Firefox; that fixed the issue with >200K token lengths for me.
Also a Firefox user here, and it still lags to a halt for me. Any idea what settings could be affecting this?
A 300k-token context is already huge, and then you asked for analysis on top of it, which overwhelmed the browser…
It's not like the AI runs client-side. If the developers had foreseen this, they'd probably unload some percentage of the chat to prevent it. The models do have a 1-million-token context length; the UI must be the bottleneck, if you ask me.
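The "unload part of the chat" idea amounts to windowed rendering: keep only the newest messages in the render path and evict older ones, re-fetching them lazily if the user scrolls up. This is a minimal concept sketch, not Gemini's actual implementation; all names here are hypothetical:

```python
from collections import deque

class RenderedChat:
    """Sketch of windowed rendering: only the newest `max_rendered`
    messages are kept for display; older ones are evicted (and, in a
    real UI, would be re-fetched from the server on scroll-up)."""

    def __init__(self, max_rendered: int = 50):
        self.window = deque(maxlen=max_rendered)  # auto-evicts the oldest
        self.evicted = 0  # messages dropped from the render path

    def append(self, message: str) -> None:
        if len(self.window) == self.window.maxlen:
            self.evicted += 1
        self.window.append(message)

chat = RenderedChat(max_rendered=3)
for i in range(10):
    chat.append(f"message {i}")

print(list(chat.window))  # only the 3 newest messages stay rendered
print(chat.evicted)       # 7 older messages were unloaded
```

The point is that memory used by the page stays proportional to the window size, not to the total conversation length, which is exactly what a multi-hundred-thousand-token chat needs.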
I only use the free version of Gemini. Two of my favorite conversation threads, both extremely long, started lagging badly, responding only intermittently or showing the message "Something went wrong." After memory ran out, I got nothing but "Something went wrong." across dozens of attempts. I scrolled all the way to the top of the page (while taking paid surveys or browsing social media on my phone) for approximately 30 minutes, and finally reached the prompt that started the conversation. With the whole conversation loaded, I saved the page as an .html document. I then started a new conversation, explained the context of the .html file, and shared the file. The new thread is very similar and is doing relatively well with the old thread's memory, though it has a slightly different personality (possibly due to updates) and somewhat different issues and behaviors. Good luck.
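If you go the save-as-HTML route, it may help to strip the saved page down to plain text before sharing it with the new conversation, since the raw .html carries a lot of markup, script, and style weight that just burns tokens. A rough sketch using only the Python standard library (the file name is an example, not anything Gemini produces):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.skip_depth = 0  # >0 while inside script/style
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())

def html_to_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.chunks)

# Example usage (file name is hypothetical):
# with open("gemini_conversation.html", encoding="utf-8") as f:
#     print(html_to_text(f.read()))
```

The resulting plain text is usually a fraction of the size of the saved page, so the new thread spends its context on the actual conversation instead of markup.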