It’s been a while now, and almost every regular Google AI Studio user has run into the same issue: the longer a prompt’s context (token count) grows, the slower the entire interface becomes.
I’ve always wondered why Studio doesn’t handle this better. Instead of rendering every previous response in the browser at once when a user reopens or continues a prompt after a refresh, it could collapse old responses and load them on demand as you scroll. That would keep the UI snappy and responsive.
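The on-demand idea above is essentially list virtualization: only the chat turns that intersect the viewport get fully rendered, and the rest stay as cheap placeholders. Here is a minimal sketch of the windowing math, assuming per-turn heights are known; all names (`Turn`, `visibleRange`) are illustrative, not Studio’s actual API.

```typescript
interface Turn {
  id: string;
  height: number; // measured or estimated pixel height of this chat turn
}

// Given the scroll offset and viewport height, return the index range of
// turns that should be fully rendered, plus a small overscan buffer so
// scrolling never reveals an unrendered gap.
function visibleRange(
  turns: Turn[],
  scrollTop: number,
  viewportHeight: number,
  overscan = 2
): { start: number; end: number } {
  let offset = 0;
  let start = turns.length;
  let end = 0;
  for (let i = 0; i < turns.length; i++) {
    const top = offset;
    const bottom = offset + turns[i].height;
    // A turn is visible if it overlaps [scrollTop, scrollTop + viewportHeight).
    if (bottom > scrollTop && top < scrollTop + viewportHeight) {
      start = Math.min(start, i);
      end = Math.max(end, i);
    }
    offset = bottom;
  }
  return {
    start: Math.max(0, start - overscan),
    end: Math.min(turns.length - 1, end + overscan),
  };
}
```

With ten 100 px turns and a 200 px viewport scrolled to 250 px, only turns 2–4 overlap the viewport, and the overscan widens that to 0–6 — everything else could stay as a fixed-height placeholder until scrolled into view.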
Another frustrating UI issue: while Gemini is generating a response, the interface keeps snapping back to the first line of the output, making it impossible to scroll and read at your own pace. I get that Google wanted fancy JavaScript scroll behavior, but here it feels more like a nuisance than a feature. And the same mechanism throws you back to the very first response when you reopen a prompt after a browser reload.
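The usual fix for hijacked scrolling during streaming is scroll anchoring: auto-scroll only when the reader is already at (or near) the bottom, and otherwise leave their position alone. A minimal sketch of that decision, with an assumed slack threshold (none of this reflects Studio’s actual implementation):

```typescript
// Decide whether streaming output should auto-scroll the container.
// Auto-scroll only if the user is within `threshold` px of the bottom;
// once they scroll up to read, their position is never touched.
function shouldAutoScroll(
  scrollTop: number,
  viewportHeight: number,
  contentHeight: number,
  threshold = 40 // px of slack before the user counts as "scrolled away"
): boolean {
  const distanceFromBottom = contentHeight - (scrollTop + viewportHeight);
  return distanceFromBottom <= threshold;
}
```

In a real page this would run on each streamed chunk, scrolling to the bottom only when it returns true; the same guard would also stop a reopened prompt from yanking the view back to the first response.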
Please do something about it, Google.