According to the Google Developer Knowledge Base (accessed via the official MCP server), Antigravity currently does not support the MCP OAuth specification. This means we cannot use standard Client IDs and Secrets to authenticate connections to remote Google Cloud MCP servers: the tools show up in the IDE, but authentication fails.
While we wait for the engineering team to add native OAuth support to the Antigravity extension, I wanted to share two workarounds I’ve been using that completely bypass this limitation.
Workaround 1: The API Key Bypass (For Supported Servers)
If the remote MCP server supports alternative authentication, you can entirely bypass the OAuth handshake by passing an API key directly in the Antigravity IDE configuration.
For example, when connecting to the Google Developer Knowledge MCP Server, I simply generated an API Key in my Google Cloud project and provided it in the IDE’s MCP settings instead of trying to configure a Client ID. Antigravity connects natively, and the search_documents and get_documents tools become immediately available to the agent context.
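For reference, the resulting entry in the underlying config file can look roughly like the sketch below. This is a sketch, not the definitive schema: the server name is arbitrary, and the exact way your Antigravity version stores the key (header vs. dedicated field) may differ, so treat the `X-Goog-Api-Key` header here as an assumption based on standard Google API conventions.

```json
{
  "mcpServers": {
    "google-developer-knowledge": {
      "command": "mcp-remote",
      "args": [
        "https://developerknowledge.googleapis.com/mcp",
        "--header",
        "X-Goog-Api-Key: YOUR_API_KEY"
      ],
      "env": {}
    }
  }
}
```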
Workaround 2: The gcloud Authorized Bridge Script (For OAuth-Strict Servers)
Prerequisites for this method:
- You must have the Google Cloud CLI (`gcloud`) installed on the host machine executing the script.
- You must authenticate your terminal session beforehand by running `gcloud auth login` and `gcloud config set project <YOUR_PROJECT_ID>`.
- Permissions & Resources:
  - For the Developer Knowledge Server: you only need a valid Google Cloud project with the necessary API enabled (e.g., `developerknowledge.googleapis.com`).
  - For infrastructure servers (Compute, BigQuery, Storage): the account you log in with must have the corresponding IAM roles (e.g., `Compute Viewer`, `BigQuery Data Viewer`) on the target project. The bridge script will seamlessly pass these IAM credentials through to the MCP server.
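Before wiring anything up, it's worth sanity-checking these prerequisites from the terminal. A minimal check, assuming a standard `gcloud` installation:

```shell
#!/bin/bash
# Verify the gcloud CLI is installed and report the active project, if any.
if command -v gcloud >/dev/null 2>&1; then
  echo "gcloud found; active project: $(gcloud config get-value project 2>/dev/null)"
else
  echo "gcloud not found: install the Google Cloud CLI and run 'gcloud auth login' first"
fi
```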
For other Google Cloud MCP servers that strictly require OAuth and won’t accept an API key, we can’t authenticate natively in the IDE. However, you can create a local proxy script that fetches a fresh OAuth token directly from your gcloud CLI environment and passes it to the generic mcp-remote HTTP-to-stdio converter.
Crucial Security Step: Securing the mcp-remote package

Before we create the proxy script, we need to eliminate a potential supply-chain vulnerability. We will avoid using `npx -y mcp-remote`, which downloads and executes the latest matching version from the open NPM registry every time the script runs. If that package is ever compromised upstream, it could hijack your credentials.
To lock this down, bypass the dynamic npx execution entirely and pin a specific, vetted version (for example, 0.1.38) on your local or remote host machine:
npm install -g mcp-remote@0.1.38
By installing it globally, we can call the `mcp-remote` binary directly in our proxy shell scripts, insulating our custom bridging solution from any future malicious NPM updates.
Now we can create a bridge script (e.g., `kb_bridge.sh`) in your workspace:
#!/bin/bash
# Run the MCP server locally over stdio using the gcloud authorized context,
# bypassing Antigravity's OAuth limitations.
set -euo pipefail

# Fetch a short-lived OAuth access token from the active gcloud session
TOKEN=$(gcloud auth print-access-token)

exec mcp-remote https://developerknowledge.googleapis.com/mcp --header "Authorization: Bearer $TOKEN"
(Just make sure to replace https://developerknowledge.googleapis.com/mcp with the actual HTTP endpoint of the server you're targeting.)
Make the script executable (chmod +x kb_bridge.sh). Then, instead of configuring the remote server directly in Antigravity’s settings, configure a new server pointing to this local script.
How to configure Antigravity to use your script
If you are running Antigravity natively or via a remote connection (like SSH / WSL), the IDE stores the MCP configuration file directly on the host machine at: ~/.gemini/antigravity/mcp_config.json
Open that file and add a new entry for your proxy script. Your JSON should look like this example:
{
"mcpServers": {
"google-cloud-proxy-server": {
"command": "/bin/bash",
"args": [
"/absolute/path/to/your/kb_bridge.sh"
],
"env": {}
}
}
}
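A malformed edit to this file can silently break server loading, so it's worth validating the JSON after editing. One way to do that, assuming `python3` is available on the host (any JSON linter works equally well):

```shell
# Validate the config file's JSON syntax after editing it
python3 -m json.tool ~/.gemini/antigravity/mcp_config.json >/dev/null \
  && echo "mcp_config.json: valid JSON" \
  || echo "mcp_config.json: syntax error (or file not found)"
```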
Once saved, Antigravity reloads its servers automatically. It executes the bash script, grabs a valid OAuth access token from your active gcloud session, and connects over standard, purely local MCP stdio, bypassing the IDE's OAuth limitations completely.
Bonus: Adapting this Proxy Pattern for Any Cloud Provider
While Workaround 2 specifically tackles Google Cloud MCP servers, this exact bash-proxy structure is a universal workaround for the Antigravity OAuth limitation.
Because Antigravity cannot currently render the interactive pop-ups and browser redirects mandated by the official MCP OAuth specification, we are simply pushing that responsibility down to our host operating system. As long as your local terminal can fetch a valid authentication token, you can bridge any remote MCP server (AWS, Azure, or internal corporate tools) via the mcp-remote NPM package.
All you need to do is swap out the CLI command that dynamically fetches the token in your bash script.
For Example (Azure MCP Servers):
#!/bin/bash
# Fetch an Azure bearer token dynamically using the Azure CLI
TOKEN=$(az account get-access-token --query accessToken --output tsv)
mcp-remote https://your-azure-server.example.com/mcp --header "Authorization: Bearer $TOKEN"
For Example (Internal Corporate Auth Tools):
#!/bin/bash
# Fetch a short-lived token from an internal company auth CLI
TOKEN=$(my-company-cli login --print-token)
mcp-remote https://internal-mcp-server.example.internal/mcp --header "Authorization: Bearer $TOKEN"
By pointing Antigravity to locally execute these proxy scripts, you completely bypass the IDE’s authentication limitations across your entire tech stack.
Security Risks associated with these Workarounds
Whether you use Workaround 1 (providing an API key via the Antigravity UI) or Workaround 2 (the proxy script), there are significant security implications you must be aware of.
When you look under the hood at the ~/.gemini/antigravity/mcp_config.json file, it reveals that Antigravity's "native" UI doesn't actually contain a secure, in-memory MCP client. Instead, the IDE simply writes your credentials to disk in plaintext and spawns a third-party Node.js shell command (`npx mcp-remote`) to run the server.
Because both workarounds rely on this shell-spawning architecture, they share these risks:
- Credential exfiltration via process lists (the biggest risk): because the `mcp-remote` command receives the credential as a command-line argument (`--header "Authorization: Bearer $TOKEN"` or `--header "X-Goog-Api-Key:..."`), your live access token or API key is injected directly into your machine's running process list. Anyone who can run `ps aux` while your IDE is connected can read your highly privileged credentials in plain text.
- Plaintext storage on disk: if using Workaround 1, your permanent, non-expiring API key is saved as raw plaintext in the `mcp_config.json` file. Any user with read access to your home directory can hijack it.
- Strict HTTPS requirement: because the authentication header is appended manually in the command-line config, you must ensure your target MCP server URL uses `https://`. Connecting to an `http://` endpoint will send your credentials across the open internet in the clear.
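One partial mitigation for the process-list leak: the `mcp-remote` package documents that `${VAR}` placeholders inside `--header` values are expanded from the environment by the tool itself. If your pinned version supports this, you can export the token and pass a single-quoted, literal placeholder, so `ps aux` shows only `${AUTH_TOKEN}` rather than the secret. Verify this expansion behavior against your pinned `mcp-remote` version before relying on it. The snippet below is a pure-shell illustration of the quoting rule this depends on:

```shell
#!/bin/bash
# Single quotes keep the placeholder literal: the shell does not expand it,
# so the secret never lands in the spawned process's argument list.
TOKEN="secret-123"
literal='Authorization: Bearer ${AUTH_TOKEN}'   # what ps aux would show
expanded="Authorization: Bearer $TOKEN"         # secret baked into argv
echo "$literal"
echo "$expanded"
```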
Of the two, Workaround 2 (the bash proxy) is the lower-risk mechanism. Because it generates an ephemeral OAuth token on the fly that expires within about an hour, it avoids storing a permanent, highly sensitive API key in plaintext on your disk.
How to mitigate these risks: these workarounds are reasonably safe only if you are running the IDE on a local, personal, unshared host machine (such as your own laptop). If you are using Antigravity connected to a shared remote Linux server, SSH jumpbox, or CI/CD runner, these methods expose your credentials to other users on that box!
Why We Need True Native Support
While these proxy scripts and API-key fallbacks are functional, relying on unstructured shell wrappers and risking persistent token leaks in system process lists is poorly suited to enterprise development, especially when dealing with high-stakes cloud infrastructure permissions.
We need true, native MCP OAuth support embedded directly into the Antigravity application layer. Doing so will:
- Allow us to securely authenticate to strict OAuth servers natively without writing custom proxy scripts.
- Prevent our raw API keys and OAuth tokens from being exposed in our shell environments (`ps aux`) or plaintext JSON dotfiles.
- Eliminate the reliance on unverified, third-party NPM proxy packages (`mcp-remote`) downloaded on the fly.
If you are an engineer or PM working on Antigravity, please prioritize native, secure support for the MCP OAuth specification!