Files API increase quota beyond 20GB?

I am using the Gemini API with the Files API for video understanding on large files on behalf of users in a unique case.
I built a pipeline that handles several videos concurrently and deletes each file once inference is complete.
However, it seems like I am hard-capped at 20 GB of Files API storage per Google Cloud project / Gemini API key.
Ideally I would not have to spawn several projects and maintain several tokens.
It would be much easier if I could increase quota for requests and file storage to have higher concurrent capacity.
Is this possible with API tokens? Can someone at Google increase this? It seems rather arbitrary when you can make multiple projects to accomplish the same thing.
I'd rather avoid Vertex AI with Google Cloud Storage, since some users might want to bring their own key and manage their own data.
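For context, the upload → infer → delete loop described above can be sketched roughly as follows. This assumes the google-genai Python SDK; the function name, polling interval, and error handling are illustrative, not the actual pipeline:

```python
import time

def process_video(client, path, prompt, model="gemini-2.5-pro", poll_seconds=5.0):
    """Upload a video via the Files API, run inference, then delete the file.

    `client` is expected to be a google-genai `genai.Client`; the call names
    below follow that SDK, but treat this as a sketch, not a drop-in
    implementation.
    """
    video = client.files.upload(file=path)
    try:
        # Video files are processed server-side before they become usable.
        while video.state.name == "PROCESSING":
            time.sleep(poll_seconds)
            video = client.files.get(name=video.name)
        if video.state.name == "FAILED":
            raise RuntimeError(f"Files API processing failed for {path}")
        response = client.models.generate_content(
            model=model, contents=[video, prompt]
        )
        return response.text
    finally:
        # Delete immediately so concurrent jobs stay under the
        # per-project Files API storage cap.
        client.files.delete(name=video.name)
```

Deleting each file in a `finally` block keeps storage from accumulating, but it doesn't change the hard cap: all concurrent in-flight uploads still have to fit inside the per-project limit.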

Hi @Evan_Lesmez,

Welcome to the Google AI Forum! :confetti_ball: :confetti_ball:

Can you share which one you are currently using and your project tier?

Hi Krish. Do you mean which project I am currently using, or which model?
I am using API key(s) generated from https://aistudio.google.com/projects.
Every project I have created is listed as Tier 1 for quota.
I'm targeting Gemini 2.5 Pro.

I set up Vertex AI before with ADC credentials, but it was a lot of overhead, and as I said, some users might want to manage their own data without it going through my bucket.

So is this possible or do I need to hack my way around the imposed quotas?