BUG: Context Caching blocked (max_total_token_count=0) on Paid Tier 1 project

Hello Gemini API Team,

I am reporting a critical provisioning issue: Context Caching is completely disabled for my project, even though the account is on Paid Tier 1 with active billing and a positive balance.

The Problem: Every attempt to create a cached content object fails with the following error:

    {
      "error": {
        "code": 400,
        "message": "Cached content is too large. total_token_count=6525109, max_total_token_count=0",
        "status": "INVALID_ARGUMENT"
      }
    }
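For reference, a minimal reproduction sketch using the google-generativeai Python SDK (the file name, TTL, and API-key environment variable are placeholders; the request is assembled in a helper so the payload shape can be inspected without a network call):

```python
import datetime
import os

def build_cache_request(model: str, text: str, ttl_seconds: int = 3600) -> dict:
    # Keyword arguments for caching.CachedContent.create(), assembled
    # separately so the payload can be checked without network access.
    return {
        "model": model,
        "contents": [text],
        "ttl": datetime.timedelta(seconds=ttl_seconds),
    }

if __name__ == "__main__":
    # pip install google-generativeai
    import google.generativeai as genai

    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    request = build_cache_request(
        model="models/gemini-1.5-flash-001",
        # Placeholder input; any sufficiently large document reproduces it.
        text=open("large_document.txt", encoding="utf-8").read(),
    )
    # On this project this always raises 400 INVALID_ARGUMENT:
    # "Cached content is too large. total_token_count=..., max_total_token_count=0"
    cache = genai.caching.CachedContent.create(**request)
    print(cache.name)
```

The same call succeeds on other Paid Tier projects I have access to, which is why I suspect a per-project quota issue rather than a client-side mistake.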

Key Observations:

  • My project reports a max_total_token_count of 0 for all models (including gemini-1.5-flash-001 and gemini-3-flash).

  • I am on the Paid Tier. Billing is confirmed and active in Google AI Studio.

  • This seems to be a synchronization error between the Cloud Billing status and the API quota enforcement.
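To rule out a client-side parsing mistake on my end, this is how I read the enforced limit out of the error message (the sample string is copied verbatim from the response above):

```python
import re

# Message copied verbatim from the 400 INVALID_ARGUMENT response.
ERROR_MESSAGE = ("Cached content is too large. "
                 "total_token_count=6525109, max_total_token_count=0")

def enforced_cache_limit(message: str) -> int:
    # Extract the server-reported maximum from the error message.
    match = re.search(r"max_total_token_count=(\d+)", message)
    if match is None:
        raise ValueError("limit not present in error message")
    return int(match.group(1))

print(enforced_cache_limit(ERROR_MESSAGE))  # → 0
```

The server itself reports the ceiling as 0, so this is not a case of my content merely exceeding a normal per-model limit.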

Project Details:

  • Region: Global / EU

As I use this API in my professional legal guardianship office, I rely on the caching feature to handle large document sets. Could you please check why the cache quota for this project is hardcoded to 0 and force a resync?

Thank you for your help!