I was able to add conversation history as JSON to the prompt and pass it to generate_content in the google.generativeai SDK, but the new SDK and the 2.0 models do not accept it. I want to give the content in this format: [... {'role': 'model', 'parts': ''}, {'role': 'user', 'parts': ''}, {'role': 'model', 'parts': [...]}, {'role': 'function', 'parts': [...]} ...].
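For context, a minimal sketch of the shape mismatch, assuming (as the error below suggests) that the new SDK validates `contents` with Pydantic and expects each turn's `parts` to be a list of part dicts rather than a bare string. The helper name is mine, not from either SDK:

```python
# Assumption: the new SDK rejects {'role': ..., 'parts': '<string>'} and
# wants parts as a list of part dicts such as {"text": ...}. This helper
# normalizes an old-style turn into that list-of-parts shape.

def normalize_turn(turn):
    """Convert {'role': ..., 'parts': <str or list>} to the list-of-parts shape."""
    parts = turn["parts"]
    if isinstance(parts, str):
        parts = [{"text": parts}]  # bare string -> single text part
    return {"role": turn["role"], "parts": parts}

history = [
    {"role": "user", "parts": "What is the capital of France?"},
    {"role": "model", "parts": "Paris."},
]
normalized = [normalize_turn(t) for t in history]
# normalized[1] == {"role": "model", "parts": [{"text": "Paris."}]}
```

This only fixes the string-vs-list shape; whether the 'function' role itself is still accepted is a separate question.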
I handle the history myself because I want to switch between different LLM backends.
The new SDK gives this error:

"Extra inputs are not permitted [type=extra_forbidden, input_value='model', input_type=str]
For further information visit Redirecting...
contents.list[union[File,Part,is-instance[Image],str]].69.Part.parts..."
Thanks Gunand!
The chat history I have is complex: it contains function calls and function responses to reinforce the model's proper tool and function use. I also pass several functions to generate_content. The chat history is stored separately and can be reused across multiple models. Here is a snippet of the code:
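(The original snippet did not come through; a hypothetical history of the shape described, with made-up names like `get_weather`, might look like this:)

```python
# Hypothetical, backend-agnostic history mixing text turns with a
# function call and its response. The tool name, arguments, and
# response payload are illustrative only.
history = [
    {"role": "user", "parts": [{"text": "What's the weather in Paris?"}]},
    {"role": "model", "parts": [
        {"function_call": {"name": "get_weather", "args": {"city": "Paris"}}},
    ]},
    {"role": "function", "parts": [
        {"function_response": {"name": "get_weather",
                               "response": {"temp_c": 18, "sky": "clear"}}},
    ]},
    {"role": "model", "parts": [{"text": "It's 18 °C and clear in Paris."}]},
]
```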
Can you check the function calling cookbook available in the quickstart tutorial? Specifically, refer to the manual function calling section, where an example shows how to pass the function response back to the model. For more examples, you can also explore the function calling documentation.
Let me know if this helps or if you are looking for something else.
The issue is not function calling; it is the chat history. My question is: in what format can I provide a chat history that includes user, model, and function roles to the generate_content API in the new SDK?
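One possible reshaping, sketched under an unverified assumption: if the new SDK's `contents` only accepts the roles 'user' and 'model', then turns stored with role 'function' could be re-labelled as 'user' turns whose parts carry the function_response payloads. This is plain-dict manipulation, not a confirmed SDK API:

```python
# Assumption: only "user" and "model" roles are accepted, and function
# responses ride in a "user" turn. Re-label "function" turns and make
# sure every turn's parts is a list of part dicts.

def to_new_sdk_history(history):
    out = []
    for turn in history:
        role = "user" if turn["role"] == "function" else turn["role"]
        parts = turn["parts"]
        if isinstance(parts, str):  # old shape: bare string
            parts = [{"text": parts}]
        out.append({"role": role, "parts": parts})
    return out

old = [
    {"role": "user", "parts": "Weather in Paris?"},
    {"role": "model", "parts": [
        {"function_call": {"name": "get_weather", "args": {"city": "Paris"}}}]},
    {"role": "function", "parts": [
        {"function_response": {"name": "get_weather", "response": {"temp_c": 18}}}]},
]
new = to_new_sdk_history(old)
# new[2]["role"] == "user"
```

The part dicts themselves (`function_call`, `function_response` keys) are kept as stored; whether the SDK's Pydantic models accept them in exactly this dict form would need checking against the SDK's type definitions.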