What are the efficient ways to dump and load chat thread/history?

Hi, I think I've discussed how to persist chat history before, but right now my solution is to dump the entire `ChatSession.history` object (from `model.start_chat(history=_chat_thread).history`) by encoding it to JSON with jsonpickle, storing it in the database (I use MongoDB), and decoding it later.

While I found that "this works", I also found that it can be inefficient. I'm also concerned about how future-proof this solution is: the Gemini Python SDK may change things in the future, and I wouldn't want a saved chat thread to break when it's loaded back after a backwards-incompatible change.

The reason I dump this Python object (which consists of protobufs that I have no idea how to work with) and pickle/encode it as JSON is that it contains the full context of the conversation, including uploaded attachments and tool-use context, which I need to maintain conversation consistency.
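One alternative to pickling the SDK's proto objects wholesale is to map each history entry down to plain dicts before JSON-encoding. A rough sketch, assuming each entry exposes a `role` string and `parts` whose items have a `.text` attribute (the exact attribute names depend on the SDK version, and the stub classes here only stand in for the real history objects):

```python
import json

def history_to_plain(history):
    """Flatten SDK history entries into plain role/parts dicts.

    Assumes each entry has a `role` string and iterable `parts`
    whose items expose a `.text` attribute -- adjust to match
    the SDK version you actually use.
    """
    return [
        {"role": entry.role, "parts": [part.text for part in entry.parts]}
        for entry in history
    ]

def plain_to_history(plain):
    """Rebuild role/parts dicts; a list shaped like this can be
    passed back as the `history=` argument when restarting a chat."""
    return [
        {"role": msg["role"], "parts": list(msg["parts"])}
        for msg in plain
    ]

# Stand-ins for the SDK's proto objects, just for the round-trip demo.
class _Part:
    def __init__(self, text):
        self.text = text

class _Entry:
    def __init__(self, role, parts):
        self.role, self.parts = role, [_Part(p) for p in parts]

history = [_Entry("user", ["Hello"]), _Entry("model", ["Hi there!"])]
payload = json.dumps(history_to_plain(history))   # what would go into MongoDB
restored = plain_to_history(json.loads(payload))
```

This loses anything that isn't plain text (inline attachments, function-call parts), so it's only a sketch of the shape, not a drop-in replacement for the full proto dump.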

Right now this is the implementation:

How the chat thread is stored (using MongoDB):
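To guard against backwards-incompatible SDK changes, one option is to wrap whatever blob goes into Mongo in an envelope that records a schema version and the SDK version it was written under, so stale documents can be detected (or migrated) on load instead of failing deep inside decoding. A minimal sketch; the field names here are my own invention, not anything the SDK defines:

```python
import json

SCHEMA_VERSION = 1  # bump whenever the stored shape changes

def wrap_for_storage(encoded_history, sdk_version):
    """Build the Mongo document; `encoded_history` is the JSON string
    produced by jsonpickle (or any other encoder)."""
    return {
        "schema_version": SCHEMA_VERSION,
        "sdk_version": sdk_version,  # e.g. the installed SDK's __version__
        "history": encoded_history,
    }

def unwrap_from_storage(doc):
    """Refuse to decode documents written under a different schema,
    rather than letting decoding break in unpredictable ways."""
    if doc.get("schema_version") != SCHEMA_VERSION:
        raise ValueError(
            f"stored schema {doc.get('schema_version')} != {SCHEMA_VERSION}; "
            "migrate before loading"
        )
    return doc["history"]

doc = wrap_for_storage(json.dumps([{"role": "user", "parts": ["hi"]}]), "0.8.3")
restored = json.loads(unwrap_from_storage(doc))
```

This doesn't make the pickled payload itself more robust, but it at least turns "mystery breakage on load" into an explicit, checkable condition.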

This kinda feels wrong, but I don't know if it's a good solution for my use case.

Now I'm wondering how you all store chat history, apart from this current solution, or at least how to dump/load it effectively.

Since under the hood the history will at some point be interpolated into the prompt, I wonder if it's possible to access that format somehow. That would be a single long string, though some metadata could be lost compared to a pickle (such as function calls?).
Also keep in mind that since the history is stuffed into the prompt, you want to curb it so it doesn't bloat the prompt too much (latency grows with prompt size, and so does cost).
But I also have this issue: when a chat goes out of scope and the native history is lost, I might want to persist it and load it back another day (the same as what the Gemini mobile app, ChatGPT, or the Copilot mobile app offer when they save conversations).
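Trimming before persisting is one way to curb that prompt bloat. A naive sketch, using a character budget as a crude stand-in for real token counting (the function name and message shape are mine, not the SDK's):

```python
def trim_history(history, max_chars=2000):
    """Keep the most recent messages whose combined text fits the
    budget -- a crude stand-in for token-based trimming or for
    summarizing older turns.  Each message is assumed to be a
    {"role": ..., "parts": [text, ...]} dict."""
    kept, used = [], 0
    for msg in reversed(history):        # walk from newest to oldest
        size = sum(len(p) for p in msg["parts"])
        if used + size > max_chars and kept:
            break                        # budget exhausted; drop older turns
        kept.append(msg)
        used += size
    kept.reverse()                       # restore chronological order
    return kept

history = [
    {"role": "user", "parts": ["x" * 1500]},
    {"role": "model", "parts": ["y" * 1500]},
    {"role": "user", "parts": ["latest question"]},
]
trimmed = trim_history(history, max_chars=2000)
# The oldest turn is dropped; the most recent turns fit the budget.
```

A real version would count tokens with the model's tokenizer and probably keep user/model turns paired, but the idea is the same: bound what you restore, not just what you store.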