I remember reading back in late March that 2.5 pro would get a two million context window within Google AI studio. Is this still something in the works?
Thank you!
Considering how steeply the cost of usage rises even within the 1M context window, increasing it to 2M would hardly be useful to anyone unless the cost were cut at least in half.
So I'd guess that focusing on the effectiveness of LLMs would bring more value than just chasing an ever-higher context window…
Another point is that even within the current 1M limit, Gemini still returns very messy outputs once the context reaches the 300-400k range.
We still need to wait for breakthroughs that are implementable and manageable from a cost perspective.