I see. I use Node.js 16, with "openai": "^4.73.1".
I modified my local copy of core.js so I can debug code execution. The request goes to the right URL,
https://generativelanguage.googleapis.com/v1beta/openai/chat/completions
so the path "/chat/completions" is correct. This is what is now logged in the console:
OpenAI:DEBUG:request https://generativelanguage.googleapis.com/v1beta/openai/chat/completions {
method: 'post',
path: '/chat/completions',
body: {
model: 'gemini-1.5-flash',
messages: [
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object],
[Object], [Object], [Object], [Object]
],
tools: [
[Object], [Object], [Object],
[Object], [Object], [Object],
[Object], [Object], [Object],
[Object], [Object], [Object],
[Object], [Object], [Object],
[Object], [Object], [Object],
[Object], [Object], [Object]
],
n: 1,
user: '65cfebc32f5e1b37d4e52327',
frequency_penalty: 0,
presence_penalty: 0,
temperature: 0.2,
stream: true,
stream_options: { include_usage: true },
tool_choice: 'auto'
},
stream: true
} {
'content-length': '64504',
accept: 'application/json',
'content-type': 'application/json',
'user-agent': 'OpenAI/JS 4.73.1',
'x-stainless-lang': 'js',
'x-stainless-package-version': '4.73.1',
'x-stainless-os': 'Windows',
'x-stainless-arch': 'x64',
'x-stainless-runtime': 'node',
'x-stainless-runtime-version': 'v16.20.2',
authorization: 'Bearer xxxxxx',
'x-stainless-retry-count': '0'
}
OpenAI:DEBUG:response (error; (error; no more retries left)) 400 https://generativelanguage.googleapis.com/v1beta/openai/chat/completions {
'alt-svc': 'h3=":443"; ma=2592000,h3-29=":443"; ma=2592000',
'content-encoding': 'gzip',
'content-type': 'application/json; charset=UTF-8',
date: 'Tue, 26 Nov 2024 15:06:24 GMT',
server: 'scaffolding on HTTPServer2',
'server-timing': 'gfet4t7; dur=374',
'transfer-encoding': 'chunked',
vary: 'Origin, X-Origin, Referer',
'x-content-type-options': 'nosniff',
'x-frame-options': 'SAMEORIGIN',
'x-xss-protection': '0'
} undefined
PRIA LOGIC ERROR BadRequestError: 400 status code (no body)
at Function.generate (C:\_svn\_praxis\code\pria_ui\pria-ui-1\node_modules\openai\error.js:45:20)
at OpenAI.makeStatusError (C:\_svn\_praxis\code\pria_ui\pria-ui-1\node_modules\openai\core.js:293:33)
at OpenAI.makeRequest (C:\_svn\_praxis\code\pria_ui\pria-ui-1\node_modules\openai\core.js:337:30)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async doStreamConversation (C:\_svn\_praxis\code\pria_ui\pria-ui-1\routes\middlewares\openai.js:441:28)
at async Object.streamConversation (C:\_svn\_praxis\code\pria_ui\pria-ui-1\routes\middlewares\openai.js:386:5)
at async rag.chat (C:\_svn\_praxis\code\pria_ui\pria-ui-1\routes\middlewares\rag.js:484:13)
at async C:\_svn\_praxis\code\pria_ui\pria-ui-1\routes\ai\personal\qanda.js:90:3 {
status: 400,
headers: {
'alt-svc': 'h3=":443"; ma=2592000,h3-29=":443"; ma=2592000',
'content-encoding': 'gzip',
'content-type': 'application/json; charset=UTF-8',
date: 'Tue, 26 Nov 2024 15:06:24 GMT',
server: 'scaffolding on HTTPServer2',
'server-timing': 'gfet4t7; dur=374',
'transfer-encoding': 'chunked',
vary: 'Origin, X-Origin, Referer',
'x-content-type-options': 'nosniff',
'x-frame-options': 'SAMEORIGIN',
'x-xss-protection': '0'
},
request_id: undefined,
error: undefined,
code: undefined,
param: undefined,
type: undefined
}
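For reference, the client is created roughly like this (simplified; the environment variable name is a placeholder, and the baseURL matches the URL in the debug log above):

const OpenAI = require('openai');

// Simplified client setup pointing at the Gemini OpenAI-compatible endpoint.
// The API key environment variable name is a placeholder.
const client = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai/',
});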
Updates:
I was able to make the library work only after I removed the following properties from the JSON:
- frequency_penalty
- presence_penalty
- tool_choice
- user

Additionally, the library won't work if you declare more than one tool in the tools section, as in the example below (a request shape that did work for me is sketched after the JSON):
"tools": [
{
"type": "function",
"function": {
"name": "get_browser",
"description": "You have the 'get_browser' tool. Use 'get_browser' tool to search the web for up-to-date information ",
"parameters": {
"type": "object",
"properties": {
"argument_1": {
"type": "string",
"description": "The specific information, event, or topic to search (e.g. Latest financial news)."
}
},
"required": [
"argument_1"
]
}
}
},
{
"type": "function",
"function": {
"name": "get_foo",
"description": "You have the 'get_foo' tool. Use 'get_foo' tool to get the users location",
"parameters": {
"type": "object",
"properties": {
"argument_2": {
"type": "string",
"description": "The specific location."
}
},
"required": [
"argument_2"
]
}
}
}
],
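For comparison, here is roughly the request shape that did work for me against this endpoint, using the client sketched above (the messages and the single tool are placeholders based on the example above; the point is the removed parameters and the single-element tools array):

// Inside an async function. A request shape the Gemini OpenAI-compatible
// endpoint accepted for me: no frequency_penalty, presence_penalty,
// tool_choice or user, and only one tool declared.
const stream = await client.chat.completions.create({
  model: 'gemini-1.5-flash',
  messages: [{ role: 'user', content: 'What is the latest financial news?' }], // placeholder
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_browser',
        description: "Use the 'get_browser' tool to search the web for up-to-date information.",
        parameters: {
          type: 'object',
          properties: {
            argument_1: {
              type: 'string',
              description: 'The specific information, event, or topic to search.',
            },
          },
          required: ['argument_1'],
        },
      },
    },
  ],
  n: 1,
  temperature: 0.2,
  stream: true,
  stream_options: { include_usage: true },
});

for await (const chunk of stream) {
  // handle streamed chunks as before
}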
Finally, I suggest enhancing the API libraries to communicate the actual reason for the error instead of an opaque 400.
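In the meantime, replaying the failing request with the built-in https module prints whatever error body the endpoint actually returns (a rough sketch; the payload and the API key environment variable are placeholders):

const https = require('https');

// Replay a minimal version of the failing request and print the raw error body.
const payload = JSON.stringify({
  model: 'gemini-1.5-flash',
  messages: [{ role: 'user', content: 'Hello' }], // placeholder
  frequency_penalty: 0, // one of the parameters the endpoint rejected for me
});

const req = https.request(
  'https://generativelanguage.googleapis.com/v1beta/openai/chat/completions',
  {
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      authorization: `Bearer ${process.env.GEMINI_API_KEY}`,
    },
  },
  (res) => {
    let body = '';
    res.on('data', (chunk) => (body += chunk));
    res.on('end', () => console.log(res.statusCode, body));
  }
);
req.on('error', console.error);
req.end(payload);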
Where can we file these bugs so they get addressed in future versions of the OpenAI library or Gemini?
Thank you for all your help.