Large context windows slow down the whole Studio UI

This has been an issue for a while now, and almost every regular Google AI Studio user has run into it: the longer the context (token count) in a prompt window, the slower the entire interface becomes.

I’ve always wondered why Studio doesn’t handle this better. Instead of dumping every previous response into the DOM at once when a user reopens or continues a prompt after a refresh, it could collapse older responses and load them on demand. That would keep the UI snappy and responsive.
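Something like the sketch below would already go a long way. This is just one way to do on-demand loading (placeholders hydrated as they scroll into view); `fetchTurnText` is a made-up helper standing in for however the app actually retrieves a stored response.

```typescript
// Minimal sketch: render lightweight placeholders for old turns and
// hydrate each one only when it scrolls into view.
// `fetchTurnText` is a hypothetical helper, not a real Studio API.

type TurnId = string;

declare function fetchTurnText(id: TurnId): Promise<string>;

function mountLazyHistory(container: HTMLElement, turnIds: TurnId[]): void {
  const observer = new IntersectionObserver(async (entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const el = entry.target as HTMLElement;
      observer.unobserve(el);                       // hydrate each turn once
      el.textContent = await fetchTurnText(el.dataset.turnId as TurnId);
    }
  }, { root: container, rootMargin: "600px" });     // prefetch just offscreen

  for (const id of turnIds) {
    const placeholder = document.createElement("div");
    placeholder.dataset.turnId = id;
    placeholder.style.minHeight = "80px";           // reserve space so the scrollbar stays stable
    placeholder.textContent = "…";
    container.appendChild(placeholder);
    observer.observe(placeholder);
  }
}
```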

Another frustrating UI issue: while Gemini generates a response, the interface keeps snapping back to the first line, making it impossible to scroll and read at your own pace. I get that Google wanted fancy JavaScript scroll behavior, but in this case it feels more like a nuisance than a feature. The same behavior then throws you back to the very first response when you reopen a prompt after a browser reload.
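I don’t know what Studio’s scroll code actually does, but the usual fix for this is simple: only auto-scroll while the user is already at the bottom, and stop the moment they scroll up. A rough sketch of that pattern:

```typescript
// Sketch of "stick to bottom only if the user is already there":
// auto-scroll during streaming, but stop as soon as the user scrolls up.

function followStream(
  scroller: HTMLElement
): (chunk: string, target: HTMLElement) => void {
  let pinnedToBottom = true;

  scroller.addEventListener("scroll", () => {
    // Treat the user as "at the bottom" within a small tolerance.
    const distance =
      scroller.scrollHeight - scroller.scrollTop - scroller.clientHeight;
    pinnedToBottom = distance < 40;
  });

  return (chunk, target) => {
    target.textContent += chunk;                    // append the streamed text
    if (pinnedToBottom) {
      scroller.scrollTop = scroller.scrollHeight;   // follow only if pinned
    }
  };
}
```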

Please do something about this, Google.


Same.new handles this by letting you start a new chat that carries over context, either from a section of the previous chat or from your overall project. You start fresh but it still holds on to what was happening, and you can go back into the previous chats too.

It’s not a browser problem. It happens when the conversation gets too long: the website becomes laggy and slow. It’s a Google issue, so my workaround is this: go to the file that’s automatically saved in your Google Drive (named after your conversation), download it, edit it, and save it as a .txt file. Then upload it to a new conversation in Google AI Studio. It contains all the context, and now you have a lag-free text field.
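If you want to script the “edit it and save it as a .txt” step, here’s a hypothetical Node sketch. It assumes the downloaded Drive file is JSON with a list of turns that each carry `role` and `text` fields; that schema is an assumption, so inspect your own file and adjust the names before using it.

```typescript
// Hypothetical converter: Drive conversation file -> plain .txt transcript.
// The JSON shape below ("turns", "role", "text") is an assumption, not a
// documented AI Studio format; check your own downloaded file first.
import { readFileSync, writeFileSync } from "node:fs";

interface Turn { role: string; text: string; }    // assumed shape

const raw = JSON.parse(readFileSync("my-conversation", "utf8")); // placeholder filename
const turns: Turn[] = raw.turns ?? [];            // assumed top-level key

const transcript = turns
  .map((t) => `${t.role.toUpperCase()}:\n${t.text}`)
  .join("\n\n");

writeFileSync("my-conversation.txt", transcript);
console.log(`Wrote ${turns.length} turns to my-conversation.txt`);
```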


Wow, going to try this! TY

Yep… it’s true! It happens to me too.