400 Error "Request contains an invalid argument" when calling v1beta/openai/chat/completions from AI Studio Build Environment

Hello everyone,

I’m encountering an issue when trying to use the OpenAI-compatible Chat Completions endpoint from within the AI Studio Build environment.

Objective:
My goal is to call the v1beta/openai/chat/completions endpoint using the OpenAI SDK from a client-side application running in the AI Studio’s “Build with Gemini” environment.

Steps to Reproduce:
I am using the openai npm package, configured to point to the Google Generative Language API base URL. Here is the code snippet I am running (it is essentially the provided example code):

import {marked} from 'marked';
import OpenAI from 'openai';

const GEMINI_API_KEY = process.env.GEMINI_API_KEY;

// Helper that renders markdown arguments into the page for quick debugging.
async function debug(...args: string[]) {
  const turn = document.createElement('div');
  const promises = args.map(async (arg) => await marked.parse(arg ?? ''));
  const strings = await Promise.all(promises);
  turn.innerHTML = strings.join('');
  document.body.append(turn);
}

// Point the OpenAI SDK at the Gemini OpenAI-compatible base URL.
const openai = new OpenAI({
  apiKey: GEMINI_API_KEY,
  baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai/',
  dangerouslyAllowBrowser: true,
});

const response = await openai.chat.completions.create({
  model: 'models/gemini-2.5-flash',
  messages: [
    {role: 'system', content: 'You are a helpful assistant.'},
    {
      role: 'user',
      content: 'Explain to me how AI works',
    },
  ],
});

debug(response.choices[0].message.content);

Expected Result:
The request should succeed with an HTTP 200 status code, returning a valid JSON response containing the AI-generated chat message.
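
For reference, a successful call to this OpenAI-compatible endpoint returns a body shaped roughly like the following (the values here are illustrative placeholders, not an actual response):

{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "created": 1700000000,
  "model": "gemini-2.5-flash",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "AI works by..."},
      "finish_reason": "stop"
    }
  ]
}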

Actual Result:
The request fails with an HTTP 400 (Bad Request) error.

When inspecting the network request in the browser’s developer console (F12), the raw response body is:

[,[3,"Request contains an invalid argument."]]

Additional Context:

  1. This issue appears to be specific to the AI Studio Build environment.
  2. The failing request is also routed through a different endpoint than successful ones: it goes to https://alkalimakersuite-pa.clients6.google.com/$rpc/google.internal.alkali.applications.makersuite.v1.MakerSuiteService/ProxyStreamedCall, whereas successful requests go to https://alkalimakersuite-pa.clients6.google.com/$rpc/google.internal.alkali.applications.makersuite.v1.MakerSuiteService/ProxyUnaryCall.
  3. This leads me to believe the issue might be related to how requests from this specific endpoint are processed or proxied within the AI Studio environment (see the plain-fetch comparison sketch below).
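
To help isolate whether the failure comes from the AI Studio proxy rather than the request payload, the same call can be made with a plain fetch against the documented REST endpoint. This is only a diagnostic sketch, assuming the Bearer-token auth described in the OpenAI-compatibility docs and reusing GEMINI_API_KEY and debug from the snippet above:

// Diagnostic sketch: send the same chat completion with plain fetch so the
// result can be compared against the SDK request inside the Build environment.
const res = await fetch(
  'https://generativelanguage.googleapis.com/v1beta/openai/chat/completions',
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${GEMINI_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'models/gemini-2.5-flash',
      messages: [
        {role: 'system', content: 'You are a helpful assistant.'},
        {role: 'user', content: 'Explain to me how AI works'},
      ],
    }),
  },
);
debug(`HTTP ${res.status}`, await res.text());

If this direct call succeeds while the SDK call fails, that would point at the ProxyStreamedCall routing rather than the request payload.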

Has anyone else in the community experienced this? Is this a known issue, or does anyone have suggestions for a potential workaround?

Any help or insight would be greatly appreciated.

Thank you

@Lalit_Kumar @Mrinal_Ghosh @Govind_Keshari @GUNAND_MAYANGLAMBAM
Sorry for tagging directly here, but I’ve been stuck on this issue for over a week. Do you have any insights?

@Lalit_Kumar @Krish_Varnakavi1 @Mrinal_Ghosh Do you have any insights on this? It’s really strange.

Hi @Ystone,

Our internal team is reviewing your issue; we will update you soon.

Thank you!


Thanks man. This bug is consistently reproducible – really appreciate everything your team’s done.

Hi @Ystone, I have tested the code on my side and it's working fine. I have shared the code below; please check on your side.

import OpenAI from "openai";
import "dotenv/config";


const GEMINI_API_KEY = process.env.GEMINI_API_KEY;
if (!GEMINI_API_KEY) throw new Error("Missing GEMINI_API_KEY");

const openai = new OpenAI({
    apiKey: GEMINI_API_KEY,
    baseURL: "https://generativelanguage.googleapis.com/v1beta/openai/"
});

const response = await openai.chat.completions.create({
    model: "gemini-2.5-flash",
    messages: [
        { role: "system", content: "You are a helpful assistant." },
        {
            role: "user",
            content: "Explain to me how AI works",
        },
    ],
});

console.log(response.choices[0].message.content);


Where are you running this code? Is it in AI Studio Build? I tried executing your code snippet but still encountered the same error.

@Sivasankaran_R

As mentioned in my post, this bug only occurs in the AI Studio Build environment. Maybe try running the code in AI Studio Build - you should be able to reproduce the error quickly. By the way, based on my observation, it seems the issue lies with the https://alkalimakersuite-pa.clients6.google.com/$rpc/google.internal.alkali.applications.makersuite.v1.MakerSuiteService/ProxyStreamedCall endpoint.