Batch API request with different models for each `GenerateContentRequest`?

The batch mode API requires you to add a model parameter (https://ai.google.dev/api/batch-api#google.ai.generativelanguage.v1beta.BatchService.BatchGenerateContent), but each request in batch.inputConfig also accepts a model parameter (https://ai.google.dev/api/batch-api#GenerateContentRequest). Which model will it actually use?

Is it possible to declare a different model per inline request in the batch API?
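
For concreteness, here is roughly the kind of request I mean (a sketch only; the model names, prompt text, and display name are placeholders):

```python
import os
import requests

# Model #1: specified in the endpoint URL.
url = ("https://generativelanguage.googleapis.com/v1beta/"
       "models/gemini-2.5-flash:batchGenerateContent")

body = {
    "batch": {
        "display_name": "which-model-wins",
        "input_config": {
            "requests": {
                "requests": [
                    {
                        # Model #2: GenerateContentRequest also has a model field.
                        "request": {
                            "model": "models/gemini-2.5-pro",
                            "contents": [{"parts": [{"text": "Hello"}]}],
                        },
                        "metadata": {"key": "req-1"},
                    }
                ]
            }
        }
    }
}

resp = requests.post(
    url,
    json=body,
    headers={"x-goog-api-key": os.environ["GEMINI_API_KEY"]},
)
print(resp.json())
```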

Hi @hev09cphid,

Welcome to the community!

No, you cannot use a different model per inline request. The model is fixed when you create the inline batch job request (see Batch API inline requests).

If an inline GenerateContentRequest also specifies a model, the model parameter in the URL is the one that will actually be used, because in the Batch API the execution model is defined at the job level via the endpoint URL.
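
As a minimal sketch with the google-genai Python SDK (model name, prompts, and display name are just examples), the model you pass when creating the job is the one applied to every inline request:

```python
# Assumes the google-genai SDK is installed and GEMINI_API_KEY is set in the environment.
from google import genai

client = genai.Client()

# Inline requests carry only their contents; there is no per-request model here,
# because the job-level model below is the one that runs.
inline_requests = [
    {"contents": [{"parts": [{"text": "Explain batching in one sentence."}], "role": "user"}]},
    {"contents": [{"parts": [{"text": "List two benefits of the Batch API."}], "role": "user"}]},
]

batch_job = client.batches.create(
    model="models/gemini-2.5-flash",  # job-level model: this is what executes
    src=inline_requests,
    config={"display_name": "inline-batch-example"},
)
print(batch_job.name, batch_job.state)
```

If you need different models, the practical approach is to create one batch job per model.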

Thank you!