AI Studio becomes ridiculously slow and lags when tokens reach 50K. Your local INP value of 48,488 ms is poor

Dear All and to whom it may concern,
Greetings.
Google AI Studio becomes ridiculously slow and lags, to the point where you cannot even type properly, let alone wait for an answer to generate. This starts once a conversation reaches roughly 50K tokens and happens in every new prompt: it works quickly at first but becomes unbearable as the context grows.

Interaction to Next Paint (INP)

48,488 ms

Your local INP value of 48,488 ms is poor.
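For reference, that wording looks like the Web Vitals extension's; the same INP measurement can be reproduced with the open-source web-vitals library. A minimal sketch, assuming an ES-module context and nothing specific to AI Studio:

// Log Interaction to Next Paint as you interact with the page.
// metric.value is the interaction latency in ms; anything above ~500 ms is rated "poor".
import { onINP } from 'web-vitals';

onINP((metric) => {
  console.log('INP', Math.round(metric.value), 'ms -', metric.rating);
});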

logs


3 Likes

https://www.reddit.com/r/Bard/comments/1jpt9ux/for_google_devs_ai_studio_lag_likely_causes_tldr/

If any Google devs happen to be lurking, just wanted to drop a few notes that might help debug this issue. Here’s what I’ve been seeing:

The Issue:

  • Main problem: The UI starts lagging really badly as the chat gets longer. It doesn’t feel linear - more like exponential slowdown.
  • What happens: Typing gets super delayed (2-3 seconds input lag at first, 10-15 seconds later as the chat keeps growing), and buttons (Send/Run) take a while to respond after clicking.
  • What triggers it: Seems tied to the total length of the conversation (user + AI messages over time), not just the size of the current message. Brand new chats feel fine.
  • Frontend issue?: The lag kicks in before a message is even sent (while typing) and happens no matter what model is selected, which makes it look like a frontend bottleneck.
  • Cross-platform: Reproducible on Windows/Mac across Chrome, Brave, Firefox, and on mobile (Safari on iOS, Chrome/Brave on Android).

What Might Be Causing It:

a) DOM Bloat (Most Likely Primary Cause):

Chrome dev tools show that DOM node count starts around 2-3k in a fresh chat, but blows up to 100k+, even 300k+ as the chat grows. There doesn’t seem to be a limit.
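Anyone can spot-check this from the DevTools console with a couple of lines of standard DOM API (nothing AI-Studio-specific assumed here):

// Log the total DOM node count every 5 seconds while chatting.
// A fresh chat sits around 2-3k nodes; a long one keeps climbing.
setInterval(() => {
  console.log('DOM nodes:', document.getElementsByTagName('*').length);
}, 5000);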

The more DOM nodes there are, the slower everything gets - the correlation between node count and lag is strong.

Typing triggers what looks like massive layout/repaint work across the entire DOM.

CPU usage also shoots up - it hits 100% on a decent machine just from typing in a long chat. A screenshot from Brave dev tools showing this is linked on Imgur.
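Main-thread blocking like this can also be confirmed outside DevTools with the Long Animation Frames API (Chromium 123+); a rough sketch:

// Report any animation frame that blocks the main thread for more than 1 s,
// together with the scripts (e.g. an onkeydown handler) that caused it.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.blockingDuration > 1000) {
      console.log('Long frame:', Math.round(entry.duration), 'ms, blocked by',
        entry.scripts.map((s) => s.invoker).join(', '));
    }
  }
});
observer.observe({ type: 'long-animation-frame', buffered: true });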

My guess is the whole chat history is being rendered at once with no virtualization. That’s a lot of content for the browser to keep up with.

I think virtual scrolling is worth trying here.
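Not claiming this is how the AI Studio frontend is structured, but as a sketch of the idea: keep only the turns near the viewport in the DOM and fake the rest with a spacer of the right height. All names here (chatEl, turns, renderTurn) are made up for illustration, and a fixed row height is assumed for simplicity:

// Hypothetical windowed rendering for a chat transcript.
const ROW_HEIGHT = 120;                                   // assumed fixed row height
const chatEl = document.querySelector('.chat-scroll');    // made-up scroll container
const spacer = document.createElement('div');             // keeps the scrollbar honest
const windowEl = document.createElement('div');           // holds only the visible rows
spacer.style.position = 'relative';
windowEl.style.position = 'absolute';
windowEl.style.width = '100%';
spacer.appendChild(windowEl);
chatEl.replaceChildren(spacer);

function renderTurn(turn) {                                // trivial stand-in renderer
  const el = document.createElement('div');
  el.textContent = `${turn.role}: ${turn.text}`;
  el.style.height = `${ROW_HEIGHT}px`;
  return el;
}

function renderVisible(turns) {
  spacer.style.height = `${turns.length * ROW_HEIGHT}px`;
  const first = Math.max(0, Math.floor(chatEl.scrollTop / ROW_HEIGHT) - 2);
  const count = Math.ceil(chatEl.clientHeight / ROW_HEIGHT) + 4;   // small overscan
  windowEl.style.top = `${first * ROW_HEIGHT}px`;
  windowEl.replaceChildren(...turns.slice(first, first + count).map(renderTurn));
}

// Usage: chatEl.addEventListener('scroll', () => renderVisible(turns));

Real chat turns have variable heights, so a production version would need measured heights or an off-the-shelf virtual list; even just applying CSS content-visibility: auto to each turn would let the browser skip layout work for offscreen messages.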

b) Frequent countTokens Calls (Likely Contributing Factor):

I’ve noticed tons of countTokens (or similar) network requests firing constantly while typing - often looking like one per keypress.

While likely not the root cause of the exponential slowdown (which points to DOM), this constant network chatter during input definitely seems to contribute to the perceived input lag and sluggishness. Even if async, any latency or processing delay on these frequent calls can make the typing experience feel stuttery or unresponsive.

This might be exacerbating the slowdown caused by the DOM issues, especially as the main thread gets busier.

Could debouncing these calls (e.g., fire only after typing pauses for 250-500ms) and ensuring they are truly non-blocking help?
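Purely as a sketch of what that could look like (the endpoint path and UI hook below are invented placeholders, not AI Studio's actual API):

// Hypothetical debounced token counting with cancellation of stale requests.
let timer = null;
let inflight = null;

function scheduleTokenCount(draftText) {
  clearTimeout(timer);                        // restart the quiet-period timer on every keypress
  timer = setTimeout(async () => {
    inflight?.abort();                        // drop any request that is now stale
    inflight = new AbortController();
    try {
      const res = await fetch('/hypothetical/countTokens', {
        method: 'POST',
        body: JSON.stringify({ text: draftText }),
        signal: inflight.signal,
      });
      updateTokenBadge(await res.json());     // made-up UI update hook
    } catch (e) {
      if (e.name !== 'AbortError') throw e;   // ignore our own cancellations
    }
  }, 300);                                    // fire only after ~300 ms without typing
}

The important part is that nothing network-related runs synchronously inside the keydown handler itself; the request fires only after typing pauses.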

TL;DR:

Massive DOM size from rendering the full chat history is almost certainly the main issue causing the exponential slowdown (virtualization as a possible fix?). However, the very frequent token-counting network requests during typing likely exacerbate the problem and contribute significantly to the input lag.

2 Likes

console.group('[DevTools] Long animation frames for 60240ms keyboard interaction');
console.log('Scripts:');
console.table([{
"Blocking duration": 59258,
"Invoker type": "event-listener",
"Invoker": "MS-CHUNK-INPUT.onkeydown",
"Function": null,
"Source": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"Char position": -1
}, {
"Blocking duration": 838,
"Invoker type": "user-callback",
"Invoker": "FrameRequestCallback",
"Function": null,
"Source": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"Char position": -1
}]);
console.log('Intersecting long animation frame events:', [{
"name": "long-animation-frame",
"entryType": "long-animation-frame",
"startTime": 105897.70000000019,
"duration": 60212.299999999814,
"renderStart": 165170.2000000002,
"styleAndLayoutStart": 166009.1000000001,
"firstUIEventTimestamp": 105897.3999999999,
"blockingDuration": 60149.215,
"scripts": [{
"name": "script",
"entryType": "script",
"startTime": 105898.20000000019,
"duration": 59258,
"invoker": "MS-CHUNK-INPUT.onkeydown",
"invokerType": "event-listener",
"windowAttribution": "self",
"executionStart": 105898.20000000019,
"forcedStyleAndLayoutDuration": 0,
"pauseDuration": 0,
"sourceURL": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"sourceFunctionName": "",
"sourceCharPosition": -1
}, {
"name": "script",
"entryType": "script",
"startTime": 165170.2999999998,
"duration": 838,
"invoker": "FrameRequestCallback",
"invokerType": "user-callback",
"windowAttribution": "self",
"executionStart": 165170.2999999998,
"forcedStyleAndLayoutDuration": 0,
"pauseDuration": 0,
"sourceURL": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"sourceFunctionName": "",
"sourceCharPosition": -1
}]
}]);
console.groupEnd();

console.group('[DevTools] Long animation frames for 59944ms keyboard interaction');
console.log('Scripts:');
console.table([{
"Blocking duration": 58968,
"Invoker type": "event-listener",
"Invoker": "MS-CHUNK-INPUT.onkeydown",
"Function": null,
"Source": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"Char position": -1
}, {
"Blocking duration": 838,
"Invoker type": "user-callback",
"Invoker": "FrameRequestCallback",
"Function": null,
"Source": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"Char position": -1
}]);
console.log('Intersecting long animation frame events:', [{
"name": "long-animation-frame",
"entryType": "long-animation-frame",
"startTime": 105897.70000000019,
"duration": 60212.299999999814,
"renderStart": 165170.2000000002,
"styleAndLayoutStart": 166009.1000000001,
"firstUIEventTimestamp": 105897.3999999999,
"blockingDuration": 60149.215,
"scripts": [{
"name": "script",
"entryType": "script",
"startTime": 105898.20000000019,
"duration": 59258,
"invoker": "MS-CHUNK-INPUT.onkeydown",
"invokerType": "event-listener",
"windowAttribution": "self",
"executionStart": 105898.20000000019,
"forcedStyleAndLayoutDuration": 0,
"pauseDuration": 0,
"sourceURL": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"sourceFunctionName": "",
"sourceCharPosition": -1
}, {
"name": "script",
"entryType": "script",
"startTime": 165170.2999999998,
"duration": 838,
"invoker": "FrameRequestCallback",
"invokerType": "user-callback",
"windowAttribution": "self",
"executionStart": 165170.2999999998,
"forcedStyleAndLayoutDuration": 0,
"pauseDuration": 0,
"sourceURL": "https://www.gstatic.com/_/mss/boq-makersuite/_/js/k=boq-makersuite.MakerSuite.zh_CN.WxZmOVXuvH0.es5.O/am=kLZaMwAY/d=1/excm=_b/ed=1/dg=0/br=1/wt=2/ujg=1/rs=AMOXD2_ORugimq0H68vstfyECw5ScHZo1Q/m=_b",
"sourceFunctionName": "",
"sourceCharPosition": -1
}]
}]);
console.groupEnd();

1 Like

Hi @user1789, welcome to the forum!

Thanks for flagging this and for the detailed analysis. This issue has already been escalated to our internal team, and they are working on a fix. I will attach this report to the current escalation; that helps.

Thanks

1 Like

The problem still exists.

Since this forum does not allow uploading trace files, I uploaded the browser performance trace to an external site:

performance analysis trace file "Modified by moderator"

If you need it, you can check it there.

1 Like

This has been a huge problem for so long; I'm glad someone is finally addressing it with such seriousness. I've been using AI Studio from the very beginning, and I can tell you it used to be very smooth even with 50K+ tokens of context. I wonder if it's a resources issue, since the site has surely taken on more users since then. Hopefully the devs fix it; it's truly mind-boggling that, even though it's such an annoying problem, this is the first time I'm seeing someone address it here!

1 Like