Hello everyone,
I’m encountering an issue with the Gemini API: a request that worked just fine yesterday now fails with this error:
{'error': {'code': 400, 'message': 'Logprobs is not enabled for models/gemini-2.5-flash', 'status': 'INVALID_ARGUMENT'}}
Here is the structure of my request (prompts changed to dummy ones):
from google import genai
from google.genai import types

g_client = genai.Client()

resp = g_client.models.generate_content(
    model='gemini-2.5-flash',
    contents="how are you?",
    config=types.GenerateContentConfig(
        system_instruction="you are a helpful assistant",
        temperature=0,
        response_logprobs=True,
        logprobs=0,
        response_mime_type="application/json",
    ),
)
- Yesterday this worked: I got log probabilities in the response.
- Today: same code, no change on my end, just the error above.
Questions:
- Has anyone else seen the same error?
- Is there a known change or announcement that logprobs support in gemini-2.5-flash was disabled or temporarily removed?
- Are there models where logprobs are still supported?
- Any recommended workaround until this is resolved?
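
As a stopgap, I’m considering a simple fallback loop that tries the same request against a list of models and returns the first one that succeeds. This is just a sketch of the idea, with the API call stubbed out as a callable (the model names below are guesses; in real code I’d catch the SDK’s specific error class rather than bare Exception):

```python
def first_working_model(models, make_request):
    """Try each model name in order; return (model, response) for the
    first call that does not raise, or re-raise the last error."""
    last_err = None
    for model in models:
        try:
            return model, make_request(model)
        except Exception as err:  # real code: catch the SDK's ClientError
            last_err = err
    raise last_err

# With the actual SDK this would look roughly like:
# model, resp = first_working_model(
#     ["gemini-2.5-flash", "gemini-2.5-pro"],
#     lambda m: g_client.models.generate_content(
#         model=m, contents="how are you?", config=cfg,
#     ),
# )
```

Not ideal (different models may give different logprob behavior), but it would at least keep the pipeline running until this is clarified.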
Thanks in advance for any help or insight!