We’re migrating some batch workloads from OpenAI to Gemini using the OpenAI compatibility layer. On-demand (real-time) mode works perfectly with response_format: {"type": "json_schema", …}, but the Batch API rejects it.
What works in batch:
- response_format: {"type": "json_object"} — accepted
- response_format: {"type": "text"} — accepted
- No response_format — accepted
What fails in batch:
- response_format: {"type": "json_schema", "json_schema": {…}} — fails with invalid JSON, near column N: no such field: 'type' (referring to "type": "object" inside the schema definition)
- Any variation of json_schema (with/without strict, additionalProperties, name) — same error
- Gemini-native fields in the body (response_mime_type, response_schema) — no such field: 'response_mime_type'
Our setup:
- File upload via google-genai SDK (client.files.upload())
- Batch creation via OpenAI SDK with base_url pointing to generativelanguage.googleapis.com/v1beta/openai/
- Model: gemini-2.5-flash
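For reference, a minimal sketch of that wiring (make_clients is a hypothetical helper of ours, not part of either SDK; imports are deferred so the sketch doesn't require the packages at load time):

```python
# Endpoint and model from our setup; the wiring below mirrors what we do:
# google-genai for file upload, the OpenAI SDK for batch creation.
BASE_URL = "https://generativelanguage.googleapis.com/v1beta/openai/"
MODEL = "gemini-2.5-flash"

def make_clients(api_key: str):
    # Deferred imports: this sketch documents the wiring without requiring
    # google-genai or openai to be installed.
    from google import genai   # pip install google-genai
    from openai import OpenAI  # pip install openai

    genai_client = genai.Client(api_key=api_key)            # file upload side
    compat_client = OpenAI(api_key=api_key, base_url=BASE_URL)  # batch side
    return genai_client, compat_client
```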
JSONL format (standard OpenAI batch format):
{"custom_id":"req-1","method":"POST","url":"/v1/chat/completions","body":{"model":"gemini-2.5-flash","messages":[…],"temperature":0,"response_format":{"type":"json_schema","json_schema":{"name":"my_schema","schema":{"type":"object","properties":{"color":{"type":"string"}},"required":["color"]}}}}}
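To rule out hand-editing errors, we also generate the JSONL lines with json.dumps; batch_request_line below is a hypothetical helper of ours that produces the same payload the batch endpoint rejects:

```python
import json

def batch_request_line(custom_id: str, messages: list, schema: dict,
                       name: str = "my_schema") -> str:
    """Build one OpenAI-format batch JSONL line with a json_schema response_format."""
    request = {
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gemini-2.5-flash",
            "messages": messages,
            "temperature": 0,
            "response_format": {
                "type": "json_schema",
                "json_schema": {"name": name, "schema": schema},
            },
        },
    }
    return json.dumps(request, separators=(",", ":"))

schema = {
    "type": "object",
    "properties": {"color": {"type": "string"}},
    "required": ["color"],
}
line = batch_request_line("req-1",
                          [{"role": "user", "content": "Pick a color."}],
                          schema)
```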
Is json_schema structured output intentionally unsupported in the OpenAI-compat Batch API? If so, is it on the roadmap? The workaround of using json_object and describing the schema in the prompt works, but it loses the schema validation.
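Until json_schema lands in batch, we recover part of the lost validation client-side. A minimal stdlib-only sketch (validate is a hypothetical helper covering only required keys and primitive types, not a full JSON Schema validator; the response text is a placeholder):

```python
import json

SCHEMA = {
    "type": "object",
    "properties": {"color": {"type": "string"}},
    "required": ["color"],
}

def validate(instance: dict, schema: dict) -> list:
    """Minimal client-side check for required keys and primitive types,
    standing in for the server-side enforcement json_schema would give us."""
    errors = []
    for key in schema.get("required", []):
        if key not in instance:
            errors.append(f"missing required field: {key}")
    type_map = {"string": str, "number": (int, float), "integer": int,
                "boolean": bool, "object": dict, "array": list}
    for key, spec in schema.get("properties", {}).items():
        if key in instance and not isinstance(instance[key], type_map[spec["type"]]):
            errors.append(f"wrong type for field: {key}")
    return errors

# The batch body uses {"type": "json_object"} and the schema goes in the prompt:
prompt = "Reply with JSON matching this schema:\n" + json.dumps(SCHEMA)
response_text = '{"color": "blue"}'  # placeholder for a model response
errors = validate(json.loads(response_text), SCHEMA)
```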