I have been using AI Studio for over a day now building a website, and I have realized that after this many prompts it now takes over 500 seconds to complete anything with 3.1 Pro.
Is there any workaround for this?
I was able to go into the code yesterday and fix the "revision errors" issue I was having, but since then I have had nothing but problems.
@pushgit I have had the same problem for 3 days. My apps take around 600 seconds to create an image, when this used to take 5 seconds. The number of problems every single day is infuriating!
500 seconds is quick by comparison.
I have had many runs reach 1200 seconds and time out. It has improved a lot lately, though, to the point where 500 seconds counts as fast.
As far as I understand, the speed at which the model runs depends on how much you are asking of it and how much compute is readily available on Google's server end.
This is a known issue with long conversations in AI Studio: the context window grows with every message, so after many prompts the model has to process a huge number of tokens on each request, which is what slows things down.
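A rough back-of-the-envelope sketch of why this compounds (the per-message token figure is a made-up assumption, not a measurement): every new turn resends the full history, so the prompt the model must process keeps growing with each message.

```python
# Illustration only: assume each message adds ~500 tokens of context
# (a hypothetical average, not a real measurement).
tokens_per_message = 500
history = 0
for turn in range(1, 41):
    # Each turn appends to the history, and the model must re-read
    # the entire history before producing the next answer.
    history += tokens_per_message
    print(f"turn {turn:2d}: prompt size is roughly {history} tokens")
```

By turn 40 the model is re-reading on the order of 20,000 tokens per request, even though each individual message stayed small.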
A few things that actually help:
- Start a new chat: this is the quickest fix. A fresh conversation has no history to process.
- Reduce context: if you need continuity, summarize the conversation and start fresh with just the summary as the system prompt.
- Switch models: Flash models (like Gemini 2.0 Flash) are significantly faster than Pro and usually good enough for iterative coding tasks.
- Try off-peak hours: server load does affect response time, especially on the free tier.
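The "reduce context" step above can be sketched in plain Python (everything here is hypothetical glue code, not part of any Google SDK). A real version would ask the model to write the summary text; this stand-in just shows the shape of the trick:

```python
def compress_history(messages, keep_last=4):
    """Replace all but the last `keep_last` turns with one summary stub.

    `messages` is a list of (role, text) tuples. In a real workflow the
    summary text would come from a model call; here it is a placeholder.
    """
    if len(messages) <= keep_last:
        return list(messages)
    dropped = messages[:-keep_last]
    summary = ("system",
               f"Summary of the first {len(dropped)} messages goes here.")
    return [summary] + messages[-keep_last:]

# Usage: a long chat collapses to a summary stub plus the recent turns.
chat = [("user", f"prompt {i}") for i in range(50)]
trimmed = compress_history(chat)
print(len(trimmed))  # 5: one summary stub + the last 4 messages
```

Pasting that summary into a fresh chat as the system prompt keeps the continuity you need while dropping the thousands of tokens that were slowing every request.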
The 500-second delays are usually a combination of context length and server load. Starting fresh typically cuts response time back to a few seconds.