OpenAI client compatibility for the Batch API

The OpenAI and Google genai Python clients already expose very similar interfaces for managing batches. With the google-genai client:

from google import genai
from google.genai import types

client = genai.Client()

# Create file
uploaded_file = client.files.upload(
    file='my-batch-requests.jsonl',
    config=types.UploadFileConfig(display_name='my-batch-requests', mime_type='jsonl')
)

# Start batch
file_batch_job = client.batches.create(
    model="gemini-2.5-flash",
    src=uploaded_file.name,
    config={
        'display_name': "file-upload-job-1",
    },
)
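For reference, each line of the JSONL file uploaded above pairs a caller-chosen key with a request body. A minimal sketch of generating such a file, assuming the format shown in the Gemini batch docs (the prompts are illustrative):

```python
import json

# Each line pairs a "key" (echoed back in the results) with a "request"
# holding a GenerateContentRequest-style body.
requests = [
    {
        "key": f"request-{i}",
        "request": {"contents": [{"parts": [{"text": prompt}]}]},
    }
    for i, prompt in enumerate(["Hello!", "Explain batching in one line."])
]

with open("my-batch-requests.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")
```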

With the OpenAI client (https://platform.openai.com/docs/guides/batch):

from openai import OpenAI

client = OpenAI()

# Create file
batch_input_file = client.files.create(
    file=open("test_batch_subset.jsonl", "rb"),
    purpose="batch"
)

# Start batch
client.batches.create(
    input_file_id=batch_input_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={
        "description": "RPMD Test Batch",
    }
)
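The JSONL file on the OpenAI side has a different per-line shape: each line is a self-contained request with a custom_id, method, URL, and body. A minimal sketch of generating it (the model name and prompts are illustrative):

```python
import json

# Each line is one request in the documented OpenAI batch input format:
# custom_id, method, url, and the chat-completions request body.
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",  # illustrative model name
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello!", "Explain batching in one line."])
]

with open("test_batch_subset.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")
```

The key difference from the Gemini format is that each OpenAI line carries its own method and endpoint URL, which is what lets a single batch file target a specific API surface.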

Extending Gemini's OpenAI compatibility layer to the Batch API would be great!
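A hypothetical sketch of what this could look like: the OpenAI-compat base URL below is real and serves chat completions today, but batch support on it is the feature being requested, so the calls are shown commented out and do not work currently.

```python
# The existing OpenAI-compat endpoint; batches on it are hypothetical.
compat_base_url = "https://generativelanguage.googleapis.com/v1beta/openai/"

# With batch support added, only the client construction would change
# relative to the OpenAI snippet above:
#
# from openai import OpenAI
# client = OpenAI(api_key=..., base_url=compat_base_url)
# batch_input_file = client.files.create(
#     file=open("my-batch-requests.jsonl", "rb"), purpose="batch")
# client.batches.create(
#     input_file_id=batch_input_file.id,
#     endpoint="/v1/chat/completions",
#     completion_window="24h",
# )
```

This would let existing OpenAI batch pipelines target Gemini by swapping a single base URL.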