Get logprobs at output token level

Hi there,

Has anyone found a solution, or an explanation, for why the API always returns "Logprobs is not supported for the current model." for seemingly every model I try when calling through Google's GenAI library (not OpenAI's SDK)?

Here is a minimal example using Python's google-genai library:

from google import genai
import os

# create client
client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))

response = client.models.generate_content(
    model='gemini-2.0-flash',
    contents='What type of food is a tomato?',
    config={
        'response_mime_type': 'application/json',
        'response_logprobs': True
    },
)

This raises: ClientError: 400 INVALID_ARGUMENT. {'error': {'code': 400, 'message': 'Logprobs is not supported for the current model.', 'status': 'INVALID_ARGUMENT'}}
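For context, here is what I am ultimately trying to do once the request is accepted. This is only a sketch: it assumes the response exposes a logprobs_result on the first candidate with chosen_candidates entries carrying token and log_probability fields (per the Vertex AI LogprobsResult schema; the exact attribute names may differ on your endpoint). The SDK imports are deferred so the helper can be used standalone.

```python
import math
import os


def logprob_to_pct(logprob: float) -> float:
    """Convert a natural-log probability to a percentage."""
    return math.exp(logprob) * 100


def print_token_logprobs(response) -> None:
    # Assumption: logprobs land on candidate.logprobs_result, with one
    # chosen_candidates entry per output token (token, log_probability).
    result = response.candidates[0].logprobs_result
    for chosen in result.chosen_candidates:
        print(f"{chosen.token!r}: {chosen.log_probability:.3f} "
              f"({logprob_to_pct(chosen.log_probability):.1f}%)")


if __name__ == "__main__" and os.getenv("GEMINI_API_KEY"):
    from google import genai
    from google.genai import types

    client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
    response = client.models.generate_content(
        model="gemini-2.0-flash",  # swap in a model that accepts logprobs
        contents="What type of food is a tomato?",
        config=types.GenerateContentConfig(
            response_logprobs=True,
            logprobs=3,  # also request top-3 alternatives per position
        ),
    )
    print_token_logprobs(response)
```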

Does anyone know which models are actually supposed to support this?
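In case it helps anyone else, here is a brute-force sketch I could use to probe which models accept the flag: send a tiny request with response_logprobs enabled to each candidate model and record which ones reject it. The model names in the list are just examples, and I catch a broad Exception because the SDK surfaces the rejection as a ClientError(400); adjust to taste.

```python
import os


def probe_models(client, model_names, probe="ping"):
    """Return (supported, rejected) model names for response_logprobs."""
    supported, rejected = [], []
    for name in model_names:
        try:
            client.models.generate_content(
                model=name,
                contents=probe,
                config={"response_logprobs": True},
            )
            supported.append(name)
        except Exception:  # the SDK raises ClientError (400) when rejected
            rejected.append(name)
    return supported, rejected


if __name__ == "__main__" and os.getenv("GEMINI_API_KEY"):
    from google import genai

    client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
    # Hypothetical candidate list; substitute the models you have access to.
    names = ["gemini-2.0-flash", "gemini-2.5-flash"]
    print(probe_models(client, names))
```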

Thanks in advance.