Gemini Files 20 GB is not enough

Hello,

I’m currently using Gemini 2.5 Flash to generate AI-powered comments on LinkedIn. I’ve built a tool that lets hundreds of users generate comments with AI, and part of the process involves letting users upload their own files directly to Gemini Files.
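
Simplified, the flow looks roughly like this (google-genai Python SDK; the file path and prompt are illustrative):

```python
from google import genai

client = genai.Client()  # API key read from the GEMINI_API_KEY environment variable

# A user's file is uploaded once to Gemini Files and we keep the returned handle.
uploaded = client.files.upload(file="user_docs/profile_notes.pdf")

# Each comment generation then references that uploaded file alongside the prompt.
response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents=[uploaded, "Draft a short, friendly comment for this LinkedIn post: ..."],
)
print(response.text)
```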

Here’s the issue:

  • Files uploaded to Gemini Files are automatically deleted after 48 hours.
  • For now, I’ve implemented a workaround: a cron job that re-uploads the files every 24 hours (see the sketch just after this list).
  • However, this creates another problem: we are starting to hit the 20 GB storage limit that Google allows, and this approach is not scalable.
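
A minimal sketch of that refresh job, assuming the google-genai Python SDK (the registry and file paths are illustrative, not our real schema):

```python
from google import genai

client = genai.Client()  # API key read from the GEMINI_API_KEY environment variable

# Illustrative registry: maps each stored file path to its current Gemini file name,
# so prompts always reference a copy that has not expired yet.
FILE_REGISTRY: dict[str, str] = {}

def refresh_uploads(paths: list[str]) -> None:
    """Re-upload each file before the 48-hour expiry and record the new handle."""
    for path in paths:
        old_name = FILE_REGISTRY.get(path)
        uploaded = client.files.upload(file=path)
        FILE_REGISTRY[path] = uploaded.name  # e.g. "files/abc123"
        if old_name:
            # The superseded copy keeps counting against the 20 GB quota until it
            # expires, so deleting it explicitly frees space sooner.
            client.files.delete(name=old_name)
```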

I also tried using URLs pointing to files hosted on our own Supabase Cloud storage, but the results are not as good: latency increases significantly, and the model hallucinates more often than when the files are uploaded directly to Gemini Files.

What I need is a solution that would allow:

  • Thousands of users to upload multiple files each,
  • Reliable file access for Gemini without the 48h expiration,
  • The same speed and accuracy we get when using Gemini Files.

👉 Is there any recommended way to persist files longer than 48h, or to integrate external storage (Supabase, GCS, etc.) in a way that performs as well as Gemini Files?
👉 Are there any upcoming changes or best practices for handling this at scale?

Thanks a lot for your help!

+1, this is not for production use at the moment