While testing the different backends of Firebase AI, I noticed a discrepancy in token counts. Firebase AI can be constructed with one of two backends - Vertex AI or Google AI (which shows up as the Gemini API in the Google Cloud Console). I hit my quota limit faster than expected, so I compared the token count for the exact same text on both backends. With the Vertex AI backend the count was 500 tokens, as expected. With the Google AI backend, the same text counted as 3170 tokens. Across multiple texts, I found the Google AI backend consistently adds around 2670 extra tokens.
We are using Flutter with Firebase; both backends are constructed as described in the documentation, and the tokens are counted with the countTokens method as shown here.
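For reference, a minimal sketch of the comparison with the firebase_ai package, assuming Firebase has already been initialized; the model name and sample text are placeholders, not the exact values I used:

```dart
import 'package:firebase_ai/firebase_ai.dart';

Future<void> compareTokenCounts(String text) async {
  // Same (placeholder) model name for both backends.
  const modelName = 'gemini-2.0-flash';

  // Backend 1: Vertex AI (Vertex AI Gemini API).
  final vertexModel =
      FirebaseAI.vertexAI().generativeModel(model: modelName);

  // Backend 2: Google AI (Gemini Developer API).
  final googleModel =
      FirebaseAI.googleAI().generativeModel(model: modelName);

  // Count tokens for the exact same content on both backends.
  final content = [Content.text(text)];
  final vertexCount = await vertexModel.countTokens(content);
  final googleCount = await googleModel.countTokens(content);

  // In my tests this prints something like 500 vs 3170 for the same text.
  print('Vertex AI: ${vertexCount.totalTokens} tokens');
  print('Google AI: ${googleCount.totalTokens} tokens');
}
```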
Why does the Google AI backend (Gemini Developer API) add around 2670 extra tokens to every request?