Error: [GoogleGenerativeAI Error] 400 Bad Request

Since this morning I’ve received this error message when calling Gemini 1.5:

Error: [GoogleGenerativeAI Error]: Error fetching from https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash-latest:streamGenerateContent?alt=sse: [400 Bad Request] * GenerateContentRequest.contents[2].parts: contents.parts must not be empty.

at agentNode (/usr/src/packages/components/dist/nodes/sequentialagents/Agent/Agent.js:713:15)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async RunnableLambda.workerNode [as func] (/usr/src/packages/components/dist/nodes/sequentialagents/Agent/Agent.js:446:20)
at async /usr/src/node_modules/.pnpm/@langchain+core@0.3.18_openai@4.57.3_encoding@0.1.13_zod@3.22.4_/node_modules/@langchain/core/dist/runnables/base.cjs:1662:34

It was working fine before this morning. Nothing has changed my end.

@Mauricio_Mendez? I appreciate the visibility, but I’m unsure why you posted my post here!

As for @Ryan_L, I can say that I am testing with both 1.5 Flash and 2.0 Flash, and both are working. I am using the Node API.

Perhaps, @Ryan_L, you could provide more details on how exactly you’re making the call?

@ibgib I’m using Flowise AI to run a sequential agent workflow. It works fine when I call the LLM alone without any tools (see the attached logs). But when I trigger a workflow that contains LangGraph agents, I get the error, so I think it has something to do with function calling. As you can see, it worked fine and had successfully executed hundreds of calls before this morning. Let me know if you need any specific information.

I’m not a licensed Google dev person! In trying to troubleshoot it, though, I would say the problem is probably not with the LLM’s function calling per se; rather, it sounds like a Flowise-specific issue. Concretely, judging from the error, it looks like their implementation is calling the Gemini API’s streaming generate endpoint with an empty message in the conversation history.

So it sounds like it’s on their end, not the Gemini API’s. That’s just a guess though.
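If that guess is right, a defensive filter before building the request would avoid the 400. A minimal sketch, assuming a simplified `Content` shape (the field names mirror the error message, not the real `@google/generative-ai` SDK types):

```typescript
// Simplified stand-in for the Gemini API's content entries; real types live
// in @google/generative-ai. The shapes here are assumptions for illustration.
interface Part {
  text?: string;
}

interface Content {
  role: "user" | "model" | "function";
  parts: Part[];
}

// The 400 complains that contents[2].parts is empty, so drop any history
// entries with no parts before building the streamGenerateContent request.
function dropEmptyContents(contents: Content[]): Content[] {
  return contents.filter((c) => c.parts.length > 0);
}

const history: Content[] = [
  { role: "user", parts: [{ text: "scrape this page" }] },
  { role: "model", parts: [] }, // e.g. a failed tool call left this empty
  { role: "user", parts: [{ text: "try again" }] },
];

console.log(dropEmptyContents(history).length); // 2
```

Silently dropping messages can hide the real bug, though; logging the dropped entries would point at whichever upstream step produced an empty message.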

Hey @ibgib, sorry, I thought you were! But I really appreciate the help, thanks!

It turns out I’d reached the limit on a scraping API that my flow uses and this was causing the error. :grimacing:


Hah. Would be nice to have a job with Google but I got one foot in my homeless tent as it is (due to my obsessive passion with my ibgib protocol - git has technical debt!).

As for your situation, there are a lot of moving parts: the Flowise lib (and indirectly its dependencies, especially if they’re imported by repo/CDN address, as in HTML or in languages like Deno that support it), LangGraph, each individual model instance, each model version (especially across multiple model vendors), as well as the Gemini API itself. Have you checked for updates to those other libs? Do you have logging set up for the dependencies?

Of course it could be this API, but AFAICT they’re at capacity for their free support. :roll_eyes:

Nice! Lots of moving parts these days. Interestingly enough, since I hadn’t seen your post saying you had resolved it, I was looking into that lib (I’d never heard of it), and there was a change to that error’s Agent code from only two days ago that may also be related to why a new error is manifesting.

let agentSystemPrompt = nodeData.inputs?.systemMessagePrompt as string
agentSystemPrompt = transformBracesWithColon(agentSystemPrompt) // added <------
let agentHumanPrompt = nodeData.inputs?.humanMessagePrompt as string
agentHumanPrompt = transformBracesWithColon(agentHumanPrompt) // added <------
const agentLabel = nodeData.inputs?.agentName as string
const sequentialNodes = nodeData.inputs?.sequentialNode as ISeqAgentNode[]
const maxIterations = nodeData.inputs?.maxIterations as string

You will see that the agent now composes its system prompt with a new function: transformBracesWithColon:

export const transformBracesWithColon = (input: string): string => {
    // This regex matches anything of the form `{ ... }` (no nested braces).
    // `[^{}]*?` lazily matches any run of characters that are not `{` or `}`.
    const regex = /\{([^{}]*?)\}/g

    return input.replace(regex, (match, groupContent) => {
        // groupContent is the text inside the braces `{ ... }`.

        if (groupContent.includes(':')) {
            // If there's a colon in the content, we turn { ... } into {{ ... }}
            // The match is the full string like: "{ answer: hello }"
            // groupContent is the inner part like: " answer: hello "
            return `{{${groupContent}}}`
        } else {
            // Otherwise, leave it as is
            return match
        }
    })
}
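Presumably this is meant to escape literal braces (common in JSON-shaped instructions) so LangChain’s prompt templates don’t treat them as input variables. A quick check of its behavior, with the function condensed from the snippet above:

```typescript
// Condensed from the Flowise source quoted above; same regex, same logic.
const transformBracesWithColon = (input: string): string => {
  const regex = /\{([^{}]*?)\}/g;
  return input.replace(regex, (match, groupContent) =>
    groupContent.includes(":") ? `{{${groupContent}}}` : match
  );
};

// A brace group containing a colon gets doubled (escaped for the template)...
console.log(transformBracesWithColon("Reply as { answer: string }"));
// → Reply as {{ answer: string }}

// ...while a plain {variable} placeholder is left untouched.
console.log(transformBracesWithColon("Hello {name}"));
// → Hello {name}
```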

To my nose, this function is a code smell: it blindly “transforms” the payload. It’s not a very descriptive name either, but naming things is hard! This is definitely one of those moving parts.

Regardless, you got it working… just thought it was interesting to see such a recent change coincide with a new error!