Impressions from using the 1206 model?

  1. Finally, less censorship. I use Gemini in Polish, and many synonyms or word combinations would randomly interrupt message creation. This issue has significantly decreased in this model. A step in the right direction.
  2. The model handles large prompts (5k+ tokens) much better, especially when system instructions are in JSON format. I’m genuinely surprised by the quality improvement. Another step in the right direction.
  3. Unfortunately, there’s still a noticeable drop in quality after 32k tokens. It’s as if contextual gaps appear—forgetting the conversation’s tone or specific details. This has been a persistent issue with Gemini. A large context window, but seemingly “leaky.”
  4. Unlimited usage. A very good move by Google. If they aim to attract new users, this could be quite appealing. Plus, additional training data.
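As a concrete illustration of point 2, a JSON-structured system prompt might look like the sketch below. The field names (`role`, `rules`, `tone`) are purely my own illustration, not a documented Gemini schema; the serialized JSON string is what gets passed as the system instruction text.

```python
import json

# Illustrative only: these field names are an example structure,
# not a documented Gemini schema.
system_instructions = {
    "role": "Technical translator",
    "rules": [
        "Answer in Polish",
        "Preserve code identifiers verbatim",
        "Keep responses under 300 words",
    ],
    "tone": "concise",
}

# The serialized JSON is passed as the system instruction text.
system_prompt = json.dumps(system_instructions, ensure_ascii=False, indent=2)
```

In my experience, the structured form seems to reduce ambiguity compared to free-form prose instructions, which may be why the model follows it more reliably.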

Suggestions for development:

  1. I remember back in March when I had unlimited access to GPT, Claude, and Gemini. Only Claude Opus seemed to “understand the idea of a system prompt.” It’s one thing for a model to stick to instructions, but mindlessly following them sometimes kills performance quality. It’s crucial for a model to grasp the idea behind the instructions, as this significantly boosts quality.
  2. The ability to create model chains in AI Studio? While it’s no issue in the API to set up a scenario where, for example, three models handle specific tasks before their outputs are fed into a main model, it would be great to have something like this in AI Studio. Similar to Flowise or n8n.
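The fan-in pattern described in point 2 can be sketched as below. Note that `call_model` here is a placeholder stub, not a real SDK function; in practice you would swap in an actual Gemini API call for each step.

```python
# Sketch of the fan-in pattern: three specialist models run first,
# and their combined outputs feed a main model.
def call_model(model_name: str, prompt: str) -> str:
    # Placeholder: a real implementation would call the Gemini API here.
    return f"[{model_name}] processed: {prompt}"

def chain(task: str) -> str:
    # Hypothetical specialist roles; names are illustrative only.
    specialists = ["outline-model", "research-model", "style-model"]
    partial_outputs = [call_model(name, task) for name in specialists]
    combined = "\n".join(partial_outputs)
    return call_model("main-model", f"Synthesize these drafts:\n{combined}")

result = chain("Write release notes for version 1.2")
```

This is exactly the kind of wiring that is trivial in code but that tools like Flowise or n8n let you do visually, which is what I'd like to see in AI Studio.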

Good points. I haven't had much to do with Gemini before, but 1206 does seem to hold up well against Claude and GPT: it seems clever but doesn't overdo it the way o1 does.

So it's a good start, and for the multi-model workflow you mentioned, it seems to be the easiest to set up.

Can it use function calling?

I found the model very intelligent at understanding what I was thinking; it exceeded my expectations.


There is an option for it in the settings, but I haven't tested it personally.
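For anyone new to the concept, here is a minimal, SDK-agnostic sketch of what the function-calling loop looks like: the model emits a structured request naming a tool and its arguments, the client runs the matching local function, and the result is sent back. Everything below is illustrative (stub data, made-up message format), not the real Gemini API.

```python
import json

# Local function the model is allowed to "call" (stub data source).
def get_exchange_rate(base: str, target: str) -> float:
    rates = {("EUR", "PLN"): 4.30}  # hard-coded for illustration
    return rates[(base, target)]

TOOLS = {"get_exchange_rate": get_exchange_rate}

def handle_model_turn(model_output: str) -> str:
    # The model's output is a structured tool-call request (illustrative format).
    request = json.loads(model_output)
    fn = TOOLS[request["name"]]
    result = fn(**request["args"])
    # The tool result is serialized and sent back to the model.
    return json.dumps({"tool_result": result})

# Simulated model output requesting a tool call:
reply = handle_model_turn(
    '{"name": "get_exchange_rate", "args": {"base": "EUR", "target": "PLN"}}'
)
```

The real Gemini API wraps this loop in its own request/response types, but the round-trip structure is the same.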