Hello AI Studio Team & Community,
I have noticed that when a single chat history becomes quite long (for example, exceeding 100,000 tokens), the interface tends to become less responsive. This includes noticeable input lag and reduced frame rates (FPS), both of which seem particularly evident while the model is generating its response.
This makes interaction noticeably less smooth when working in these longer chat sessions. For anyone who wants to put a rough number on it, there is a small measurement sketch below.
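In case it helps others compare notes, here is a minimal sketch for measuring the frame rate while a long chat is streaming. It is not specific to AI Studio in any way and assumes nothing about its internals; it only uses the standard browser `requestAnimationFrame` and `performance.now()` APIs, pasted into the dev-tools console on the open tab.

```typescript
// Minimal sketch: log approximate FPS once per second while the chat is
// streaming, so the perceived lag can be compared across machines.
// Uses only standard browser APIs; nothing AI Studio specific is assumed.
let frames = 0;
let last = performance.now();

function tick(now: number): void {
  frames++;
  if (now - last >= 1000) {
    // One second has elapsed: report how many frames were rendered in it.
    console.log(`~${frames} FPS`);
    frames = 0;
    last = now;
  }
  requestAnimationFrame(tick);
}

requestAnimationFrame(tick);
```

On my side, the reported rate drops noticeably once the history gets very long and the model is actively generating.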
Have other users experienced similar performance issues with very long chat histories? It would be helpful to know whether this is a shared experience.
Thank you