Google Cloud Gemini LLM batch jobs running but never completing

For the last few days, including today, since the new Google Cloud terms of service took effect, batches submitted through Google Cloud never make it out of the running state for any preview model (3.1 or 2.5-sep-2025). Is there a way to monitor overall available resources, to see whether the provider is overwhelmed? Has anyone seen similar issues elsewhere, before I switch to another provider?
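For reference, here is roughly how I'm checking the job state, a minimal sketch assuming the google-genai Python SDK (the job name, polling interval, and helper names are illustrative, not from my actual code):

```python
# Sketch of polling a Gemini batch job until it leaves the running state.
# Assumes the google-genai SDK; job name and timing values are illustrative.
import time

# Terminal states a batch job can end in (per the batch job state enum).
TERMINAL_STATES = {
    "JOB_STATE_SUCCEEDED",
    "JOB_STATE_FAILED",
    "JOB_STATE_CANCELLED",
}


def is_terminal(state_name: str) -> bool:
    """Return True once a batch job has reached a terminal state."""
    return state_name in TERMINAL_STATES


def poll_batch_job(job_name: str, interval_s: float = 60.0, max_polls: int = 60):
    """Poll a batch job until it finishes or max_polls is exhausted.

    The SDK import is done lazily so the state helper above can be used
    without google-genai installed or credentials configured.
    """
    from google import genai  # requires `pip install google-genai` + credentials

    client = genai.Client()
    for _ in range(max_polls):
        job = client.batches.get(name=job_name)
        print(job.name, job.state.name)
        if is_terminal(job.state.name):
            return job
        time.sleep(interval_s)
    # Still not terminal after max_polls -- this is the behavior I'm seeing:
    # the state stays at JOB_STATE_RUNNING indefinitely.
    return None
```

With this loop the jobs just print JOB_STATE_RUNNING on every poll and never reach a terminal state, which is why I'm asking whether there's any provider-side capacity signal to watch.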
