Feedback: AI Studio UI Performance with Long Chat Histories

Hello AI Studio Team & Community,

I have noticed that when a single chat history becomes quite extensive (for example, exceeding 100,000 tokens), the interface tends to become less responsive. This includes noticeable lag and reduced frame rates (FPS), which seems particularly evident while the AI is processing and generating its response.

This behaviour can affect the smoothness of interaction when working with these longer chat sessions.

I would like to know whether other users have experienced similar performance issues with very long chat histories. It would be helpful to understand whether this is a shared experience.

Thank you

Hi @grav,

Which model are you currently using? I've seen similar reports, but only once the token count gets close to the context window limit. It's already been escalated to the team, and they are working on it.
Also, check things on your side, such as your internet connection, that might affect performance.

Thanks.