Gemini 2.5 Pro Preview: Thinking problems

So, over the past few days I have been running into a lot of issues with Thinking. First, it seems to be locked to “ON” and cannot be disabled; that part on its own doesn’t bother me. The real problem shows up in long contexts, say around 300,000 tokens: the model keeps ignoring the prompt’s request to actually think when you give it a huge, complex prompt, even if you spam

[DONT BE LAZY! ACTIVATE THE THINKING!] multiple times in the same prompt; at that point it simply does not work. Can anyone confirm this bug? If the model is meant to always think, why does it ignore the Thinking mode setting in AI Studio? I need the model to think every time; the output quality is not the same when it does not think first. When will this bug be fixed?

Hi @RPGBlaster02
Try splitting your long prompt into chunks and prompting Gemini to “Think” after each part.
Still waiting on official confirmation or a fix from Google, but for now, reducing the context helps.
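
In case it’s useful, here is a minimal sketch of that chunking approach, assuming the google-genai Python SDK. The model id, chunk size, and file name are placeholders, and this only illustrates the pattern of splitting a long prompt and nudging the model to reason about each part; it is not a confirmed fix for the Thinking behaviour:

```python
# Rough sketch: feed a long prompt to Gemini in chunks via the google-genai SDK,
# asking the model to reason about each part before giving the final answer.
from google import genai

client = genai.Client(api_key="YOUR_API_KEY")  # or use the GOOGLE_API_KEY env var

MODEL_ID = "gemini-2.5-pro-preview-05-06"  # placeholder preview model id
CHUNK_SIZE = 20_000                        # characters per chunk, tune as needed

with open("long_prompt.txt", encoding="utf-8") as f:
    long_prompt = f.read()
chunks = [long_prompt[i:i + CHUNK_SIZE] for i in range(0, len(long_prompt), CHUNK_SIZE)]

# A chat session keeps the earlier chunks in context for the later ones.
chat = client.chats.create(model=MODEL_ID)

for i, chunk in enumerate(chunks[:-1]):
    # Ask the model to think about each part but hold off on the final answer.
    chat.send_message(
        f"Part {i + 1} of {len(chunks)}:\n{chunk}\n\n"
        "Think carefully about this part. Just acknowledge it for now."
    )

# Final chunk: ask for the full, reasoned answer over everything seen so far.
final = chat.send_message(
    f"Part {len(chunks)} of {len(chunks)}:\n{chunks[-1]}\n\n"
    "Now think through everything above step by step and give your full answer."
)
print(final.text)
```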