Hi,

Since March 4-5, 2026, the google_search grounding tool on the Live API is no longer working server-side. Instead of being handled automatically by Google, the search is sent to the client as a function_call with a parameter "q", expecting a response.

Setup:
- Model: gemini-live-2.5-flash-native-audio
- Location: europe-west8 (Vertex AI)
- Tools: google_search + function_declarations

Until yesterday it worked correctly as native grounding. Now I receive:

Function call: google_search (Args: {"q": "…"})

Is this a known regression?
Same issue here. Was working last week.
Hi Gregoire,
While we wait for an official fix or an update on why the Live API lost direct search capabilities, I managed to get it working again using a Function Calling workaround.
Since the Live API can’t search directly right now, I created a custom tool/function called ricerca_online (online_search) that acts as a proxy.
Here is the logic:
1. Gemini Live recognizes the need to search the web and triggers the custom ricerca_online({ query: "…" }) function call.
2. My backend receives the query and makes a standard Vertex AI call (non-Live) using a model like gemini-2.5-flash-lite, explicitly enabling Google Search grounding.
3. The backend extracts the text from the grounded response and returns it to Gemini Live as the function output.
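The Live-side half of these steps (declaring the proxy tool and answering its calls) can be sketched roughly like this in Python, since the logic adapts easily to other stacks. The declaration shape follows the REST-style function_declarations format; the grounded_search stub and the exact dict layout here are illustrative assumptions, not verbatim Live API client code:

```python
# Declaration for the custom proxy tool, in the REST-style JSON shape
# used for function_declarations (illustrative sketch).
RICERCA_ONLINE_DECLARATION = {
    "name": "ricerca_online",
    "description": "Searches the web and returns a concise grounded answer.",
    "parameters": {
        "type": "OBJECT",
        "properties": {
            "query": {"type": "STRING", "description": "The search query."}
        },
        "required": ["query"],
    },
}

def grounded_search(query: str) -> str:
    """Placeholder: call a non-Live Gemini model with Google Search
    grounding enabled and return the extracted text."""
    return f"(grounded answer for: {query})"

def handle_tool_call(tool_call: dict) -> dict:
    """Route a function call received from the Live session to the proxy
    and build the function response payload to send back."""
    if tool_call["name"] == "ricerca_online":
        answer = grounded_search(tool_call["args"]["query"])
        return {"name": "ricerca_online", "response": {"result": answer}}
    raise ValueError(f"Unknown tool: {tool_call['name']}")
```

The key design point is that the Live model never knows a second model is involved: it just sees a normal tool result coming back.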
Here is the C# (.NET) endpoint I am currently using. You can easily adapt this logic to Python or Node.js if you are using a different stack:
[HttpPost("search-online")]
[Consumes("multipart/form-data")]
public async Task<IActionResult> SearchOnline(
    [FromForm] string query,
    [FromForm] string projectName,
    [FromForm] string geminiModelName = "gemini-2.5-flash-lite",
    [FromForm] double temperature = 0.7)
{
    if (string.IsNullOrWhiteSpace(query))
        return BadRequest(new { error = "query is required." });
    if (string.IsNullOrWhiteSpace(projectName))
        return BadRequest(new { error = "projectName is required." });

    _logger.LogInformation("[SearchOnline] Query: '{Query}', Project: {Project}, Model: {Model}",
        query.Length > 100 ? query[..100] + "..." : query, projectName, geminiModelName);

    try
    {
        var projectConfig = _configService.GetByProjectName(projectName);
        if (projectConfig == null)
            return NotFound(new { error = $"Project '{projectName}' not found." });

        // Reuse the existing service with GoogleSearch enabled
        string vertexResponse = await _vertexGeminiService.AnalyzeInternalAsync(
            trascrizioneText: query,
            knowledgeBaseGcsUris: new List<string>(),
            prompt: "Answer the following question clearly and concisely, based exclusively on the online search results.",
            geminiModelName: geminiModelName,
            config: projectConfig,
            responseSchemaJson: null,
            maxOutputTokens: 4096,
            enableCodeExecution: false,
            enableGoogleSearch: true // <-- The key to the workaround
        );

        // Extract text from the Vertex AI response
        using var doc = System.Text.Json.JsonDocument.Parse(vertexResponse);
        if (doc.RootElement.TryGetProperty("candidates", out var candidates) && candidates.GetArrayLength() > 0)
        {
            var parts = candidates[0].GetProperty("content").GetProperty("parts");
            if (parts.GetArrayLength() > 0)
            {
                string resultText = parts[0].GetProperty("text").GetString() ?? "";
                _logger.LogInformation("[SearchOnline] ✅ Extracted response ({Len} chars).", resultText.Length);
                return Ok(new { success = true, result = resultText });
            }
        }

        return Ok(new { success = false, error = "No results from the search." });
    }
    catch (HttpRequestException ex)
    {
        _logger.LogError(ex, "[SearchOnline] Vertex AI error. Project: {Project}", projectName);
        return StatusCode((int)(ex.StatusCode ?? System.Net.HttpStatusCode.BadGateway),
            new { error = "Error from the Vertex AI service", details = ex.Message });
    }
    catch (Exception ex)
    {
        _logger.LogError(ex, "[SearchOnline] Unexpected error. Project: {Project}", projectName);
        return StatusCode(500, new { error = "Internal error.", details = ex.Message });
    }
}
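For anyone adapting this to Python, the extraction half of the endpoint (pulling the text of the first candidate's first part out of the Vertex AI JSON response) might look like the sketch below. The function name and the success/error dict shape are my own; the response layout mirrors what the C# code above parses:

```python
import json

def extract_grounded_text(vertex_response: str) -> dict:
    """Mirror of the C# extraction: return the text of the first part of
    the first candidate, or a failure marker if nothing came back."""
    doc = json.loads(vertex_response)
    candidates = doc.get("candidates") or []
    if candidates:
        parts = candidates[0].get("content", {}).get("parts") or []
        if parts and "text" in parts[0]:
            return {"success": True, "result": parts[0]["text"]}
    return {"success": False, "error": "No results from the search."}

# Example input with the candidates/content/parts shape the C# code expects:
sample = json.dumps({
    "candidates": [
        {"content": {"parts": [{"text": "Grounded answer here."}], "role": "model"}}
    ]
})
```

The defensive `or []` fallbacks play the same role as the TryGetProperty/GetArrayLength guards in the C# version.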
Note on how the Vertex AI call is structured under the hood:
To keep the code snippet short, I didn’t include the full internal service method (AnalyzeInternalAsync). However, the most important part to make this workaround successful is injecting the tools array into the JSON request body. When calling the standard Gemini model via the REST API, you must append this specific block to the payload to force the native grounding:
"tools": [
  {
    "googleSearch": {}
  }
]
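In context, a minimal generateContent-style request body with that block injected could be assembled like this in Python. This is a sketch under my own assumptions about field names beyond the documented contents/tools/generationConfig keys; adapt it to however you already build your REST payload:

```python
def build_grounded_payload(query: str, prompt: str) -> dict:
    """Assemble a generateContent-style request body with native
    Google Search grounding forced on (sketch, not a full client)."""
    return {
        "contents": [
            {"role": "user", "parts": [{"text": f"{prompt}\n\n{query}"}]}
        ],
        # The block that makes the workaround work: native grounding
        # on the non-Live model.
        "tools": [{"googleSearch": {}}],
        "generationConfig": {"maxOutputTokens": 4096, "temperature": 0.7},
    }
```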
One Crucial Tip regarding the Prompt:
Make sure to avoid any explicit reference to "google_search" in your initial prompt or system instructions (e.g., do NOT write things like "use google_search to make requests"). If you name it explicitly, the Gemini model on Vertex AI may get confused and try to invoke a custom function literally named google_search instead of triggering the native Google Search grounding tool, which results in an error. Just tell it to "search online", or reference your custom proxy function name instead.
Hope this helps you keep your project running smoothly until the normal behavior is restored! Let me know if it works for you.