Hello everyone,
I’m encountering an issue when trying to use the OpenAI-compatible Chat Completions endpoint from within the AI Studio Build environment.
Objective:
My goal is to call the v1beta/openai/chat/completions endpoint using the OpenAI SDK from a client-side application running in the AI Studio’s “Build with Gemini” environment.
Steps to Reproduce:
I am using the openai npm package, configured to point to the Google Generative Language API base URL. Here is the code snippet I am running (it is essentially the stock example code):
```ts
import {marked} from 'marked';
import OpenAI from 'openai';

const GEMINI_API_KEY = process.env.GEMINI_API_KEY;

// Renders each argument as Markdown and appends it to the page.
async function debug(...args: string[]) {
  const turn = document.createElement('div');
  const promises = args.map(async (arg) => await marked.parse(arg ?? ''));
  const strings = await Promise.all(promises);
  turn.innerHTML = strings.join('');
  document.body.append(turn);
}

const openai = new OpenAI({
  apiKey: GEMINI_API_KEY,
  baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai/',
  dangerouslyAllowBrowser: true,
});

const response = await openai.chat.completions.create({
  model: 'models/gemini-2.5-flash',
  messages: [
    {role: 'system', content: 'You are a helpful assistant.'},
    {
      role: 'user',
      content: 'Explain to me how AI works',
    },
  ],
});

await debug(response.choices[0].message.content);
```
Expected Result:
The request should succeed with an HTTP 200 status code, returning a valid JSON response containing the AI-generated chat message.
Actual Result:
The request fails with an HTTP 400 (Bad Request) error.
When inspecting the network request in the browser’s developer console (F12), the raw response body is:

```
[,[3,"Request contains an invalid argument."]]
```

(This looks like a gRPC-style status payload; status code 3 corresponds to INVALID_ARGUMENT.)
Additional Context:
- This issue appears to be specific to the AI Studio Build environment.
- The failing request is routed through a different endpoint than successful ones. The failing request goes to https://alkalimakersuite-pa.clients6.google.com/$rpc/google.internal.alkali.applications.makersuite.v1.MakerSuiteService/ProxyStreamedCall, whereas successful requests go to https://alkalimakersuite-pa.clients6.google.com/$rpc/google.internal.alkali.applications.makersuite.v1.MakerSuiteService/ProxyUnaryCall.
This leads me to believe the issue might be related to how requests from this specific endpoint are processed or proxied within the AI Studio environment.
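One way to isolate whether the problem is the SDK's payload or the Build proxy could be to construct the same Chat Completions request by hand and replay it with plain `fetch` from outside the Build environment. A rough sketch (the helper name and placeholder key are my own; the URL and body shape follow the documented OpenAI-compatible endpoint):

```typescript
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface BuiltRequest {
  url: string;
  init: {method: 'POST'; headers: Record<string, string>; body: string};
}

// Builds the raw HTTP request equivalent to what the OpenAI SDK sends,
// so it can be replayed with fetch() and compared against the proxied call.
function buildChatCompletionsRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[],
): BuiltRequest {
  return {
    url: 'https://generativelanguage.googleapis.com/v1beta/openai/chat/completions',
    init: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apiKey}`,
      },
      body: JSON.stringify({model, messages}),
    },
  };
}

// Example usage (key is a placeholder):
const req = buildChatCompletionsRequest(
  'YOUR_GEMINI_API_KEY',
  'models/gemini-2.5-flash',
  [{role: 'user', content: 'Explain to me how AI works'}],
);
console.log(req.url);
// To actually send it (e.g. from a local script, outside the Build proxy):
// const res = await fetch(req.url, req.init);
```

If this raw request succeeds outside Build but the identical payload fails inside, that would point at the ProxyStreamedCall path rather than the request itself.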
Has anyone else in the community experienced this? Is this a known issue, or does anyone have suggestions for a potential workaround?
Any help or insight would be greatly appreciated.
Thank you




