On the page it is mentioned: "The minimum input token count for context caching is 32,768."
Can I cache less than 32K and just pay for the minimum? Or what happens otherwise?
Thanks!
Hi @Lior_Trieman. Currently, you can't cache less than 32,768 tokens. If you try with fewer tokens, you will get the following error:
BadRequest: 400 POST https://generativelanguage.googleapis.com/v1beta/cachedContents?%24alt=json%3Benum-encoding%3Dint: Cached content is too small.
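Since the endpoint rejects anything under the minimum, one practical pattern is to check the token count client-side before attempting to create a cache, and fall back to passing the content directly in the prompt when it is too small. This is just a sketch of that guard — `should_cache` is a hypothetical helper, not part of the SDK; only the 32,768 minimum comes from the error above.

```python
# Minimum input size the caching endpoint accepts (per the error message above).
MIN_CACHE_TOKENS = 32_768

def should_cache(token_count: int) -> bool:
    """Return True only when the content meets the caching minimum.

    Contents below 32,768 tokens trigger a 400 "Cached content is too
    small" error, so send those as ordinary prompt context instead.
    """
    return token_count >= MIN_CACHE_TOKENS

# Example decisions (token counts would come from the SDK's count-tokens call):
print(should_cache(10_000))   # too small to cache: include it in the prompt
print(should_cache(40_000))   # large enough: safe to create a cache entry
```

The same check also covers the follow-up question below: for short conversation histories there is no cached-content path, so the usual approach is to resend the relevant history with each request.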
Is there any way for Gemini to remember your past conversations? Buying 32K tokens is too much. I just want the conversations to be remembered.