Hi @KRows, thanks for the quick reply.
I have tried various ways of adding state to the conversation. The problem in the past was that if I included any metadata with an individual prompt entry, the model would try to mimic that metadata instead of just responding with text. I would type "add a box" but the model would see "[metadata here] add a box", so it thought its responses should look like "[hallucinated metadata] its response". However, now that I mention it, I realize I'm trying a new approach that constrains function-calling mode to `ANY` and includes a `tellUser` function, so perhaps the problem won't manifest this time.
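For context, here's roughly the request shape I mean, sketched as plain objects. The field names follow my reading of the Gemini API's tool schema, and `addBox`/`tellUser` are just my example function names; verify against the SDK version you're on:

```javascript
// Sketch: force function-calling mode to ANY so the model must always
// answer with a function call, and give it a tellUser function as the
// escape hatch for plain conversational replies.
// Field names follow my reading of the Gemini request schema -- treat
// them as assumptions, not gospel.
const tools = [{
  functionDeclarations: [
    {
      name: "addBox",
      description: "Add a box to the scene.",
      parameters: {
        type: "OBJECT",
        properties: { label: { type: "STRING" } },
        required: ["label"],
      },
    },
    {
      name: "tellUser",
      description: "Say something to the user instead of acting on the scene.",
      parameters: {
        type: "OBJECT",
        properties: { message: { type: "STRING" } },
        required: ["message"],
      },
    },
  ],
}];

// ANY = the model must respond with some function call, never bare text.
const toolConfig = { functionCallingConfig: { mode: "ANY" } };

console.log(tools[0].functionDeclarations.map((f) => f.name).join(","));
```

The idea is that "talking" becomes just another function, so there's no bare text turn for stray metadata to leak into.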
That said, I see you are including the metadata in the system prompt instead of the chat history. Is this kind of function-calling metadata supposed to go in the system prompt?
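For anyone else following along, here's my understanding of that system-prompt approach, sketched out. `systemInstruction` is the Node SDK field name as I understand it; the metadata keys (`sceneId`, `boxCount`) are entirely my own invention:

```javascript
// Sketch: keep state/metadata in the system instruction so the model
// never sees it inline with user turns -- and therefore can't mimic it.
// The metadata shape here is my own example, not anything from the API.
const metadata = { sceneId: "saga-42", boxCount: 3 };

const systemInstruction =
  "You are a scene editor.\n" +
  "Current state (do not echo this back to the user):\n" +
  JSON.stringify(metadata);

// With the Node SDK this would be passed once at model construction,
// roughly: genAI.getGenerativeModel({ model: ..., systemInstruction })
console.log(systemInstruction.split("\n").length);
```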
As for the "chat history", I am currently using the Node front end and have started using `startChat` instead of `generateContent`, which allows a `history` to be included. However, looking at the raw request, this doesn't seem to do anything special to distinguish "history" from "current" for the model.
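Concretely, here's what I mean, sketched as plain objects: the `history` entries are just role/parts `Content` objects, which (if I'm reading the raw request right) simply get concatenated ahead of the current message on the wire:

```javascript
// Sketch: startChat's history appears to be just an array of Content
// objects prepended to the contents of each request -- there is no
// separate "history" field on the wire, matching what I saw in the
// raw request.
const history = [
  { role: "user", parts: [{ text: "add a box" }] },
  { role: "model", parts: [{ text: "Added a box." }] },
];

// With the Node SDK this would be roughly:
//   const chat = model.startChat({ history });
//   const result = await chat.sendMessage("add a sphere");

// What the model actually receives is effectively just:
const contents = [
  ...history,
  { role: "user", parts: [{ text: "add a sphere" }] },
];
console.log(contents.map((c) => c.role).join(","));
```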
The context token was my initial thought (I call it a saga id), but that would take me back to the issue of the model wanting to mimic metadata.
So, all that said, I've just been looking again at my other different-but-related question about how to handle multi-function-call response flows, and I noticed something in the API's `history?: Content[]` definition, and specifically in the content's `parts: Part[]` definition: there is actually a `FunctionResponsePart`, and it sits in an array.
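If I'm reading those types right, answering several parallel function calls should just be one `Content` whose `parts` array holds one `functionResponse` per call. Something like the following sketch; the role name and field shapes are my reading of the `Content[]`/`Part[]` definitions, not verified yet, and the function names/payloads are made up:

```javascript
// Sketch: replying to two function calls spawned in a single model turn
// by packing two functionResponse parts into ONE Content entry.
// Role and field names follow my reading of the Content/Part types.
const functionResponseTurn = {
  role: "function",
  parts: [
    {
      functionResponse: {
        name: "addBox",
        response: { result: "ok", boxId: 7 },
      },
    },
    {
      functionResponse: {
        name: "addSphere",
        response: { result: "ok", sphereId: 2 },
      },
    },
  ],
};

console.log(functionResponseTurn.parts.length);
```

If that's right, it would mean I never needed one round-trip per function result.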
So maybe my answer to that other question will be my answer to this one. As I mentioned there, I couldn't figure out how to return responses to the model when multiple function call requests were spawned, so I wasn't using the function response mechanism at all. But it looks like I should be able to do that now and kill two birds with one fava bean.
Thanks again for your reply, @KRows; maybe this was a rubber-ducky moment! Hopefully, anyway. I'll post here and in that other thread if it works.