Gemini API returns an error when trying to pass tool call results with "role": "tool"

I was using gemini-2.0-flash-exp through the OpenAI library and ran into several issues. One of them: when I pass tool call results back to the LLM with the "role" parameter set to "tool", the API returns the following error:

API error: Error code: 400 -

[{'error': {'code': 400, 'message': 'Request contains an invalid argument.', 'status': 'INVALID_ARGUMENT'}}]

Then I tried "role": "user", passed the tool results back the same way, and it worked fine.
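For context, here is a minimal sketch of the pattern I mean (the base URL is the documented Gemini OpenAI-compatibility endpoint; the tool name, call ID, and result values are placeholders for illustration, not my actual code):

```python
from openai import OpenAI

client = OpenAI(
    api_key="GEMINI_API_KEY",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

# Hypothetical tool call returned by the model on a previous turn.
tool_call_id = "call_0"
tool_result = '{"temp_c": 21}'

# Variant that raised 400 INVALID_ARGUMENT for me: result passed with "role": "tool".
messages_tool_role = [
    {"role": "user", "content": "What's the weather in Berlin?"},
    {
        "role": "assistant",
        "tool_calls": [{
            "id": tool_call_id,
            "type": "function",
            "function": {"name": "get_weather", "arguments": '{"city": "Berlin"}'},
        }],
    },
    {"role": "tool", "tool_call_id": tool_call_id, "content": tool_result},
]

# Workaround that worked: pass the same result as a plain user message instead.
messages_user_role = messages_tool_role[:-1] + [
    {"role": "user", "content": f"Tool result for get_weather: {tool_result}"}
]

response = client.chat.completions.create(
    model="gemini-2.0-flash-exp",
    messages=messages_user_role,
)
print(response.choices[0].message.content)
```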

Other LLM providers handle this fine, but I have found the Gemini API to have several issues, especially when used through the OpenAI library.

Hey @vaeho, I tried passing the function call response back to the gemini-2.0-flash-exp model with "role": "tool", and it is working as expected. Here is the Colab gist.
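For anyone landing here later, a minimal sketch of the full round trip (the get_weather tool, its schema, and the values below are assumptions for illustration, not taken from the gist):

```python
import json
from openai import OpenAI

client = OpenAI(
    api_key="GEMINI_API_KEY",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool for illustration
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Berlin?"}]

# First turn: the model decides to call the tool.
first = client.chat.completions.create(
    model="gemini-2.0-flash-exp", messages=messages, tools=tools
)
call = first.choices[0].message.tool_calls[0]  # assumes the model did call the tool

# Second turn: append the assistant turn that made the call, then the tool
# result with "role": "tool" referencing the call id.
messages.append(first.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": call.id,
    "content": json.dumps({"temp_c": 21}),  # dummy tool output
})

second = client.chat.completions.create(
    model="gemini-2.0-flash-exp", messages=messages, tools=tools
)
print(second.choices[0].message.content)
```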