2.5 Pro API unexpected interruption

When I use the 2.5 Pro model, the stream frequently gets interrupted unexpectedly: there is no error output, it just stops and my client crashes. Has anyone else encountered this issue? I'm calling the v1beta API from Node.

code

#!/bin/bash
set -e -E

GEMINI_API_KEY=""
MODEL_ID="gemini-2.5-pro"
GENERATE_CONTENT_API="streamGenerateContent"

cat << EOF > request.json
{
    "contents": [
      {
        "role": "user",
        "parts": [
          {
            "text": "创建一个在线商城的全栈应用,包括数据库模型、前端页面和后端API。"
          },
        ]
      }
    ],
    "generationConfig": {
      "maxOutputTokens": 65536,
      "temperature": 0.8,
      "topP": 1,
      "thinkingConfig": {
         "includeThoughts": true,
         "thinkingBudget": 24576
      }
    },
    "safetySettings": [
        {
            "category": "HARM_CATEGORY_HARASSMENT",
            "threshold": "BLOCK_NONE"
        },
        {
            "category": "HARM_CATEGORY_HATE_SPEECH",
            "threshold": "BLOCK_NONE"
        },
        {
            "category": "HARM_CATEGORY_SEXUALLY_EXPLICIT",
            "threshold": "BLOCK_NONE"
        },
        {
            "category": "HARM_CATEGORY_DANGEROUS_CONTENT",
            "threshold": "BLOCK_NONE"
        }
    ]
}
EOF

curl \
-X POST \
-H "Content-Type: application/json" \
"https://generativelanguage.googleapis.com/v1beta/models/${MODEL_ID}:${GENERATE_CONTENT_API}?key=${GEMINI_API_KEY}" -d '@request.json'

output

{
  "candidates": [
    {
      "content": {
        "parts": [
          {
            "text": "\n    category: { type: String, required: true },\n    description: { type: String, required:"
          }
        ],
        "role": "model"
      },
      "index": 0,
      "safetyRatings": [
        {
          "category": "HARM_CATEGORY_SEXUALLY_EXPLICIT",
          "probability": "NEGLIGIBLE"
        },
        {
          "category": "HARM_CATEGORY_HATE_SPEECH",
          "probability": "NEGLIGIBLE"
        },
        {
          "category": "HARM_CATEGORY_HARASSMENT",
          "probability": "NEGLIGIBLE"
        },
        {
          "category": "HARM_CATEGORY_DANGEROUS_CONTENT",
          "probability": "NEGLIGIBLE"
        }
      ]
    }
  ],
  "usageMetadata": {
    "promptTokenCount": 19,
    "candidatesTokenCount": 869,
    "totalTokenCount": 3105,
    "promptTokensDetails": [
      {
        "modality": "TEXT",
        "tokenCount": 19
      }
    ],
    "thoughtsTokenCount": 2217
  },
  "modelVersion": "gemini-2.5-pro",
  "responseId": "0OeMaLKMDfHdqtsPhuzzmAQ"
}
,
{
  "error": {
    "code": 503,
    "message": "The model is overloaded. Please try again later.",
    "status": "UNAVAILABLE"
  }
}

Hi @user2351, welcome to the community!
Based on the error message you shared, this is a 503 error, which means the service may be temporarily overloaded or down. Please switch to another model temporarily and see if it works. 503 issues are typically intermittent and often resolve on their own. If the issue still persists, let us know.
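
If you want to handle this on the client side, here is a minimal retry sketch in TypeScript (since you mentioned Node). It assumes Node 18+ with the built-in fetch, an API key in the GEMINI_API_KEY environment variable, and illustrative backoff parameters; it is not an official client. It treats both a 503 HTTP status and an error object emitted as a chunk inside the streamed array (as in your output above) as retryable. For brevity it buffers the whole stream before parsing; a production client would consume chunks incrementally.

// retry-stream.ts -- retry a streamGenerateContent call that fails with a 503,
// whether the 503 arrives as the HTTP status or as an error chunk in the stream.
// Assumptions: Node 18+ (global fetch), GEMINI_API_KEY set in the environment.

const MODEL_ID = "gemini-2.5-pro";
const ENDPOINT =
  `https://generativelanguage.googleapis.com/v1beta/models/${MODEL_ID}:streamGenerateContent` +
  `?key=${process.env.GEMINI_API_KEY}`;

const requestBody = {
  contents: [
    { role: "user", parts: [{ text: "Create a full-stack application for an online store." }] },
  ],
  generationConfig: { maxOutputTokens: 65536, temperature: 0.8, topP: 1 },
};

// One streaming attempt: throws on a 503 status or on an error chunk in the array.
async function streamOnce(): Promise<string> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(requestBody),
  });
  if (res.status === 503) throw new Error("503 UNAVAILABLE");
  if (!res.ok) throw new Error(`HTTP ${res.status}: ${await res.text()}`);

  // The default (non-SSE) streaming response is one JSON array of chunk objects.
  const chunks = JSON.parse(await res.text());
  const errChunk = chunks.find((c: any) => c.error);
  if (errChunk) throw new Error(`${errChunk.error.code} ${errChunk.error.status}`);

  // Concatenate the text parts from every chunk into the final answer.
  return chunks
    .flatMap((c: any) => c.candidates?.[0]?.content?.parts ?? [])
    .map((p: any) => p.text ?? "")
    .join("");
}

// Retry with exponential backoff; the attempt count and delays are only examples.
async function generateWithRetry(maxAttempts = 4): Promise<string> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await streamOnce();
    } catch (err) {
      if (attempt >= maxAttempts) throw err;
      const delayMs = 1000 * 2 ** (attempt - 1);
      console.warn(`attempt ${attempt} failed (${err}), retrying in ${delayMs} ms`);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

generateWithRetry().then(console.log).catch(console.error);

Note that a truncated stream (incomplete JSON array) will also fail the JSON.parse step and be retried, which covers the "stops without any error output" case you described.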