ClientError: got status: 429 Too Many Requests

I had been using the Gemini 2.5 API for the last two months without any problem, but about a week ago I started getting this error out of nowhere. I'm on the free tier, and the dashboard doesn't show anything unusual about my quota. I've also tried other Google accounts and devices, but it didn't help. I thought the model I'm using might no longer exist (it's gone from the documentation), but I get the same error even after switching to another model from the docs. Here are my error and code:

Error:

ClientError: got status: 429 Too Many Requests. {"error":{"message":"exception parsing response","code":429,"status":"Too Many Requests"}}

Code:

// Imports from the @google/genai SDK and Express
import { HarmCategory, HarmBlockThreshold } from "@google/genai";
import type { Response } from "express";

const handleGemini = async (prompt: string, schema: any, res: Response) => {
  const stream = await getGoogleAI().models.generateContentStream({
    model: "gemini-2.5-pro-exp-03-25", // The model I had been using all along. I've also tried gemini-2.5-pro-preview-05-06
    contents: [{ text: prompt }],
    config: {
      responseMimeType: "application/json",
      responseSchema: schema,
      safetySettings: [
        { category: HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT, threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE },
        { category: HarmCategory.HARM_CATEGORY_HATE_SPEECH, threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE },
        { category: HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT, threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE },
        { category: HarmCategory.HARM_CATEGORY_HARASSMENT, threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE }
      ]
    },
  });
  for await (const chunk of stream) {
    if (chunk.text) {
      res.write(chunk.text);
    }
  }

  return res.end();
};

Hi @Denys,

Check that you are not exceeding the per-minute request limits of the free-tier Gemini API. Please review the documented rate limits and make sure you stay well within them; otherwise a 429 error is expected.
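If the requests are indeed hitting the rate limit, a common mitigation is to retry with exponential backoff. Below is a minimal sketch of such a wrapper; the helper name `withBackoff` and its parameters are illustrative, not part of the SDK, and the 429 detection is based on the error shape shown in your log, so adjust it to whatever your client actually throws.

```typescript
// Hypothetical helper: retry an async call with exponential backoff on 429s.
// Not part of @google/genai — a generic sketch you would adapt to your code.
async function withBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 5,
  baseDelayMs = 1000,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err: any) {
      // Treat an explicit 429 status, or "429" in the message, as rate limiting
      const isRateLimited =
        err?.status === 429 || /429/.test(String(err?.message ?? ""));
      if (!isRateLimited || attempt >= maxRetries) throw err;
      // Wait 1s, 2s, 4s, ... plus a little jitter before retrying
      const delayMs = baseDelayMs * 2 ** attempt + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

You would then wrap the stream creation, e.g. `const stream = await withBackoff(() => getGoogleAI().models.generateContentStream({ ... }))`. Note that backoff only helps with transient bursts; if you are consistently over the free-tier quota, the fix is to reduce request volume or upgrade the plan.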