I was using gemini-2.0-flash-exp through the OpenAI library and encountered several issues. One of them: when I pass tool-call results back to the LLM with the "role" parameter set to "tool", it returns the following error:

API error: Error code: 400 -
[{'error': {'code': 400, 'message': 'Request contains an invalid argument.', 'status': 'INVALID_ARGUMENT'}}]
Then I tried "role": "user", passed the tool results back the same way, and it worked fine.
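As a minimal sketch of the workaround (the tool-call id `call_123` and the helper function are hypothetical, for illustration only), the idea is to wrap the tool output in a plain user message instead of the standard `role: "tool"` message:

```python
import json

def tool_result_message(tool_call_id: str, content: str,
                        gemini_workaround: bool = True) -> dict:
    """Build the chat message that feeds a tool result back to the model.

    The standard OpenAI-style message uses role="tool", but the Gemini
    endpoint rejected that with a 400 INVALID_ARGUMENT, so the
    workaround sends the result as a user message instead.
    """
    if gemini_workaround:
        return {"role": "user",
                "content": f"Tool result for {tool_call_id}: {content}"}
    return {"role": "tool", "tool_call_id": tool_call_id, "content": content}

# Example: feed a (hypothetical) weather tool's output back into the history.
messages = [
    {"role": "user", "content": "What's the weather in Paris?"},
    # ... the assistant message containing the tool call would go here ...
    tool_result_message("call_123", json.dumps({"temp_c": 18})),
]
```

With `gemini_workaround=False` the function emits the standard OpenAI-style tool message, so the same code path can serve providers that accept `role: "tool"` normally.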
Other LLM providers handle the "tool" role fine, but I have found the Gemini API to have several issues, especially when accessed through the OpenAI library.