Hello everyone,
I’m currently using the Gemini API through the OpenAI compatibility endpoint, and it’s been great for a smooth transition. I have a specific question regarding streaming responses.
In the OpenAI Chat Completions API, to receive the `usage` object containing token counts in the final chunk of a stream, we must explicitly set the parameter `stream_options: { "include_usage": true }`.
My question is: does the Gemini OpenAI compatibility endpoint support or require this `stream_options` parameter?
From my testing, it appears that the final stream chunk from the Gemini compatibility endpoint includes the `usage` object by default, even when I do not pass the `stream_options` parameter.
Here is a conceptual code snippet illustrating the parameter in question:
```typescript
import OpenAI from 'openai';

// Client configured for the Gemini OpenAI compatibility endpoint
const client = new OpenAI({
  apiKey: 'YOUR_GEMINI_API_KEY',
  baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai/',
});

async function main() {
  const stream = await client.chat.completions.create({
    model: 'gemini-1.5-pro',
    messages: [{ role: 'user', content: 'Write a short story.' }],
    stream: true,
    // This is the parameter in question
    stream_options: { include_usage: true },
  });

  for await (const chunk of stream) {
    // Most chunks carry content deltas; the goal is to reliably get
    // the final chunk that carries the 'usage' object.
    if (chunk.usage) {
      console.log(chunk.usage);
    }
  }
}

main();
```
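For context on my third question below: regardless of whether the parameter is honored, the pattern I am currently leaning toward is to collect chunks and read `usage` from whichever one carries it. Here is a minimal sketch of that idea; the `extractUsage` helper and the chunk/usage shapes are my own simplifications for illustration, not part of the OpenAI SDK:

```typescript
// Simplified shapes for illustration only; the real SDK types are richer.
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

interface StreamChunk {
  usage?: Usage | null;
}

// Hypothetical helper: return the usage object from whichever chunk
// carries one, or null if no chunk reported usage. The usage-bearing
// chunk is typically the last one, so scan in reverse.
function extractUsage(chunks: StreamChunk[]): Usage | null {
  for (let i = chunks.length - 1; i >= 0; i--) {
    const usage = chunks[i].usage;
    if (usage) {
      return usage;
    }
  }
  return null;
}
```

This way the code works whether the endpoint emits `usage` by default or only when `stream_options` is passed, which is why I'd like to know the official behavior.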
To be perfectly clear, my questions are:

- Does the Gemini OpenAI compatibility endpoint officially recognize the `stream_options: { "include_usage": true }` parameter?
- Is including the `usage` object in the final stream chunk the default (and intended) behavior, making this parameter unnecessary when targeting Gemini?
- If it is the default behavior, is it safe to rely on it, or would it be best practice to pass the parameter anyway for forward compatibility?
I couldn’t find specific documentation on the compatibility of this particular parameter. Any clarification on the official stance or a link to relevant docs would be much appreciated.
Thank you