I am experimenting with using the OpenAI library to communicate with Gemini, following the documentation at
When using the openai library on Node.js and executing a simple await openai.models.list(), I receive:

- With the base URL generativelanguage.googleapis.com/v1beta/openai/ : NotFoundError: 404 status code (no body)
- With the base URL generativelanguage.googleapis.com/v1beta/ : 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential.
I am supplying a valid Gemini API key that I generated for personal use.
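For reference, the client is set up roughly like this (a minimal sketch; the environment variable name is just what I use locally):

import OpenAI from "openai";

// Client pointed at the Gemini OpenAI-compatible endpoint
const openai = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: "https://generativelanguage.googleapis.com/v1beta/openai/",
});

// This is the call that fails with the 404
const models = await openai.models.list();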
I wonder if anyone else has the same issue or has recommendations to resolve this.
Thank you
Hi there,
IIRC, the list-models functionality is not available at the moment. For now, chat completions and embeddings are working.
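For example, a minimal chat completion against the compatibility endpoint looks roughly like this (the model name is just an example):

import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: "https://generativelanguage.googleapis.com/v1beta/openai/",
});

// Chat completions work today; models.list() does not yet.
const completion = await openai.chat.completions.create({
  model: "gemini-1.5-flash", // example model name
  messages: [{ role: "user", content: "Say hello" }],
});
console.log(completion.choices[0].message.content);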
Thank you for your response. I assume that /v1beta/openai/ is the proper endpoint, then.
Regarding streaming, is this call supported?
openai.beta.chat.completions.stream(inboundS);
If not, do you have a full sample of how to use streaming with tools (and how to respond to tool calls)?
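For context, the plain streaming form I can fall back to looks like this (a sketch using stream: true rather than the beta helper, in case the helper is not supported):

// Standard streaming via the stream flag instead of the beta helper
const stream = await openai.chat.completions.create({
  model: engine,
  messages: [{ role: "user", content: userPrompt }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}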
On a side note, I am trying this simple code:
const completion = await openai.chat.completions.create({
  model: engine,
  user: user._id,
  messages: [
    { role: "system", content: [{ type: "text", text: systemPrompt }] },
    { role: "user", content: [{ type: "text", text: userPrompt }] },
  ],
  temperature: 0.3,
  response_format: { type: "json_object" },
});
And I am getting:
BadRequestError: 400 status code (no body)
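For comparison, here is the same request with plain string content instead of the content-parts array (just a variant I can also test; I am not sure whether the parts format or the response_format is what triggers the 400):

const completion = await openai.chat.completions.create({
  model: engine,
  user: user._id,
  messages: [
    { role: "system", content: systemPrompt }, // plain strings instead of [{ type: "text", ... }]
    { role: "user", content: userPrompt },
  ],
  temperature: 0.3,
  response_format: { type: "json_object" },
});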
Any help is appreciated
Thank you for all your help