Gemini Flash (thinking)/Pro end their output with meaningless characters when temperature is 0

I found that Flash ended its output with a long run of whitespace (the answer part had 61,000+ tokens; I didn't record the thinking part), and Pro ended with something like "PHHP \n PP".

I then asked a 2.5 Pro instance (as a checker) with temperature 1 (max temperature is 2) to verify whether the output contained meaningless characters, and the checker itself output meaningless characters like: "PABP_V__P_A_G_A_B".

Hi Long_Peng,

Thanks for sharing your experience. The issue of Gemini Flash (thinking) or Pro models ending their output with meaningless characters appears to be related to the deterministic nature of the sampling process. Setting the temperature to 0 makes the model always select the highest-probability token, leading to deterministic outputs. However, even at temperature 0, slight variations can still occur due to factors like tokenization, model architecture, or internal processing, which might explain the unexpected characters you're seeing.

To mitigate the issue, try increasing the temperature slightly (e.g., to 0.1) and adjusting Top-K or Top-P sampling. If you are calling the API from an application, implementing a post-processing step to clean up any unwanted characters can also help maintain the quality of the responses.
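As a rough illustration of the two suggestions above, here is a minimal Python sketch: a generation-config dict with a slightly non-zero temperature plus explicit Top-P/Top-K (the values are illustrative, not recommendations), and a post-processing function that strips a trailing run of whitespace or capital-letter/underscore noise like the "PHHP \n PP" and "PABP_V__P_A_G_A_B" examples in this thread. The cleanup pattern is an assumption tailored to those specific symptoms, not a general-purpose filter.

```python
import re

# Illustrative sampling settings (keys follow the Gemini API's
# GenerationConfig; the specific values here are assumptions):
GENERATION_CONFIG = {
    "temperature": 0.1,  # slightly above 0 to avoid fully greedy decoding
    "top_p": 0.95,
    "top_k": 40,
}

def strip_trailing_junk(text: str) -> str:
    """Post-processing step: drop trailing whitespace and any trailing
    run consisting only of capital letters, underscores, spaces, and
    newlines (matching the noise patterns reported in this thread).

    The run must be preceded by a newline, so an answer that
    legitimately ends in capitals (e.g. "USA") is left alone.
    """
    cleaned = text.rstrip()
    cleaned = re.sub(r"\n[A-Z_ \n]+$", "", cleaned)
    return cleaned.rstrip()
```

This would be applied to each response before it is shown to the user, e.g. `strip_trailing_junk(response.text)`. Note the pattern is deliberately conservative; if your outputs end in different noise, the regex would need adjusting.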
