Hello everyone,
I’m encountering an issue with the Gemini API that worked just fine yesterday, but today I get an error message: {'error': {'code': 400, 'message': 'Logprobs is not enabled for models/gemini-2.5-flash', 'status': 'INVALID_ARGUMENT'}}
Here is the structure of my request (prompts changed to dummy ones):
from google import genai
from google.genai import types

# Assumes the GEMINI_API_KEY environment variable is set
g_client = genai.Client()

resp = g_client.models.generate_content(
    model='gemini-2.5-flash',
    contents="how are you?",
    config=types.GenerateContentConfig(
        system_instruction="you are a helpful assistant",
        temperature=0,
        response_logprobs=True,   # request log probabilities
        logprobs=0,               # number of top alternative tokens per position
        response_mime_type="application/json",
    ),
)
Yesterday this worked: I got log probabilities in the response.
Today: same code, no change on my end, just the error above.
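For reference, this is roughly how I was reading the log probabilities out of a successful response. The field names follow the google-genai SDK's LogprobsResult type; treat it as a sketch rather than the exact shape of every response.

# Sketch: reading logprobs from a successful response (google-genai SDK)
logprobs_result = resp.candidates[0].logprobs_result
for chosen in logprobs_result.chosen_candidates:
    # each entry carries the sampled token and its log probability
    print(chosen.token, chosen.log_probability)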
Questions:
Has anyone else seen the same error?
Is there a known change/announcement that logprobs support in gemini-2.5-flash was disabled or temporarily removed?
Are there models where logprobs are still supported?
Any recommended workaround until this is resolved?
For context, I have been trying to use Gemini’s logprobs for almost a year now across several important projects. TL;DR: support is very, very inconsistent. Sometimes access is allowed. Then it only works with Vertex AI. Then it works with the standard Gemini API. But then the package updates and the API usage changes completely. There was a brief period of peace these past few months where it worked on both the Vertex AI and GenAI APIs. Now we are back to square one as far as I can see.
Would appreciate it if anyone has a solution that works.
from google import genai
from google.genai.types import GenerateContentConfig

# 'client' here is a genai.Client created for Vertex AI (setup sketch below)
response = client.models.generate_content(
    model=MODEL_ID,
    contents="Why is the sky blue?",
    config=GenerateContentConfig(
        response_logprobs=True,
        logprobs=3,  # return the top 3 alternative tokens per position
    ),
)
Logprobs galore. Guess it’s just available via Vertex AI? You probably also need to enable the “Vertex AI API” in the Google Cloud Console project you use.
Might not be ideal depending on your auth setup, but if you have an urgent need for logprobs (like I do), this 100% works on Colab for me (I was originally using the Gemini API with an API key).
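In case it helps, here is a minimal sketch of the Vertex AI client setup that works for me. The project ID and location are placeholders you would swap for your own, and it assumes you have already enabled the Vertex AI API and authenticated (e.g. via Colab auth or gcloud).

from google import genai
from google.genai.types import GenerateContentConfig

# Placeholders: replace with your own GCP project and region
client = genai.Client(vertexai=True, project="your-project-id", location="us-central1")

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Why is the sky blue?",
    config=GenerateContentConfig(response_logprobs=True, logprobs=3),
)

# Inspect the returned log probabilities
for chosen in response.candidates[0].logprobs_result.chosen_candidates:
    print(chosen.token, chosen.log_probability)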