Gemini Files 20 is not enough

Hello,

I’m currently using Gemini 2.5 Flash to generate AI-powered comments on LinkedIn. I’ve built software that lets hundreds of users generate comments with AI, and part of the process involves letting users upload their own files directly to Gemini Files.

Here’s the issue:

  • Files uploaded to Gemini Files are automatically deleted after 48 hours.
  • For now, I’ve implemented a workaround: a cron job that re-uploads the files every 24 hours.
  • However, this creates another problem: we are starting to hit the 20 GB storage limit that Google allows, and this approach is not scalable.

I also tried using URLs pointing to files hosted on our own Supabase Cloud storage, but the results are not as good: latency increases significantly, and the model tends to hallucinate more often compared to when files are directly uploaded to Gemini Files.

What I need is a solution that would allow:

  • Thousands of users to upload multiple files each,
  • Reliable file access for Gemini without the 48h expiration,
  • The same speed and accuracy we get when using Gemini Files.

👉 Is there any recommended way to persist files longer than 48 h, or to integrate external storage (Supabase, GCS, etc.) in a way that performs as well as Gemini Files?
👉 Are there any upcoming changes or best practices for handling this at scale?

Thanks a lot for your help!

+1, this is not for production use at the moment

@Yannis_Haismann ,

At this point there is no way around the 20 GB limit. Please understand that keeping files in the Files API carries a significant computational cost: these files are tokenized and embedded up front so they are ready to be added to the LLM inference on each API call.

If you would like to use a large number of files as context for Gemini, store them in a GCS bucket and build a data-engineering process that converts the files to embeddings, stores them in a vector store, and retrieves the relevant context for each comment you generate.

Thank you