Conversation history in self.client.models.generate_content in google-genai

I was able to add conversation history as JSON to the prompt and pass it to generate_content in the google.generativeai SDK, but the new SDK with the 2.0 models does not accept it. I want to give the content in this format: [… {'role': 'model', 'parts': ''}, {'role': 'user', 'parts': ''}, {'role': 'model', 'parts': […]}, {'role': 'function', 'parts': […]} …].

I handle the history myself because I want to be able to switch between different LLM backends.

The new SDK gives this error:

"Extra inputs are not permitted [type=extra_forbidden, input_value='model', input_type=str]
For further information visit …
contents.list[union[File,Part,is-instance[Image],str]].69.Part.parts…"
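The error suggests a shape mismatch rather than a missing feature. In the new SDK, 'parts' must be a list of part dicts; when it is a bare string, the union validator apparently tries to parse the whole dict as a Part, so 'role' becomes an "extra input". A minimal illustration of the two shapes, assuming the dict forms mirror the new SDK's Content/Part models (verify against your installed version):

```python
# Assumed shapes only -- check against types.Content / types.Part in your
# google-genai version.

# Old style: 'parts' as a bare string. This is the shape the validator
# rejects with "Extra inputs are not permitted".
rejected_turn = {"role": "model", "parts": "Hello"}

# New style: 'parts' as a list of part dicts, each with an explicit key
# such as "text".
accepted_turn = {"role": "model", "parts": [{"text": "Hello"}]}
```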

Hi @Ray_Ka, welcome to the forum.

Have you tried using the chat feature, which lets you pass the chat history explicitly? Your use case seems to align well with it.

chat = client.chats.create(
    model="gemini-2.0-flash",
    history=history_chat,
)

Thanks Gunand!
The chat history I have is complex: it contains function calls and function responses to reinforce the model's use of the proper tools and function calls. I also have several functions that I pass to generate_content. The chat history is stored separately so that it can be reused with multiple models. Here is a snippet of the code:

tools = [Tool(function_declarations=functions_list)]

config = GenerateContentConfig(
    system_instruction=self.system_instruction,
    tools=tools,
    response_modalities=["TEXT"],
)

response = self.client.models.generate_content(
    model=self.gemini_model,
    contents=global_messages,
    config=config,
)

— Does chats.create support this format?
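For a history that carries tool use, the contents list can represent function calls and function responses as part dicts. A hedged sketch of what such turns might look like, where the key names ('function_call', 'function_response') are assumed from the snake_case fields of types.Part in recent google-genai releases — verify them against your version:

```python
# Assumed dict shapes for tool-use turns; not verified against a live API.

def model_function_call(name, args):
    """A model turn that requests a tool call."""
    return {
        "role": "model",
        "parts": [{"function_call": {"name": name, "args": args}}],
    }

def function_response_turn(name, result):
    """A turn carrying the tool's result back to the model.

    Older SDKs used role 'function'; the new SDK expects the response in a
    regular turn (commonly role 'user', sometimes 'tool', depending on
    version).
    """
    return {
        "role": "user",
        "parts": [{"function_response": {"name": name, "response": {"result": result}}}],
    }

history = [
    {"role": "user", "parts": [{"text": "What's 2 + 2?"}]},
    model_function_call("add", {"a": 2, "b": 2}),
    function_response_turn("add", 4),
]
```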

Note: the following code worked with the previous SDK:

gemini_functions = [
    {
        "function_declarations": functions_list
    }
]
response = self.gemini.generate_content(
    global_messages,
    generation_config=self.generation_config,
    tools=gemini_functions,
)

Can you check the function calling cookbook available in the quickstart tutorial? Specifically, refer to the manual function calling section, where an example shows how to pass the function response back to the model. For more examples, you can also explore the function calling documentation.

Let me know if this helps or if you are looking for something else.

The issue is not function calling; it is chat history. My question is: in what format can I provide the chat history, which includes user, model, and function roles, to the generate_content API in the new SDK?
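One way to bridge the two formats is a small converter from the legacy history shape (roles 'user'/'model'/'function', parts as a string or a list) into a single list of dicts for the new SDK. The role and key names below are assumptions based on the new SDK's documented Content/Part fields — check them for your version:

```python
# Hedged sketch: remap a legacy google.generativeai history into the dict
# shape assumed for the new SDK. Role mapping for 'function' turns is an
# assumption; verify against your SDK version.

def convert_history(old_history):
    new_history = []
    for turn in old_history:
        role, parts = turn["role"], turn["parts"]
        if isinstance(parts, str):
            parts = [parts]
        # Bare strings become {"text": ...} part dicts; dicts (e.g.
        # function_call / function_response parts) pass through unchanged.
        new_parts = [{"text": p} if isinstance(p, str) else p for p in parts]
        if role == "function":
            # The new SDK has no 'function' role; function responses ride
            # in a normal turn (commonly 'user', sometimes 'tool').
            role = "user"
        new_history.append({"role": role, "parts": new_parts})
    return new_history

old = [
    {"role": "user", "parts": "Hi"},
    {"role": "model", "parts": [{"function_call": {"name": "f", "args": {}}}]},
    {"role": "function", "parts": [{"function_response": {"name": "f", "response": {"result": 1}}}]},
]
new = convert_history(old)
```

The converted list could then be passed as contents=new to generate_content, or as history= to chats.create.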

Hi Gunand,
Can anyone from Google help here?

Hey, could you provide a sample of working code using the older google.generativeai SDK?