Is there a way to download a fine tuned model for local inference?
Welcome to the forum! The Gemini models can't be used for local inference; they live only within Google's infrastructure. The Gemma models, however, can be run locally, and the same is true of several other well-known open-source models.
Hope that helps
So if I can't download the fine-tuned model for local inference, is it possible to get the exact code that was used to fine-tune the model, so that I can fine-tune a model on my local machine?
The Gemini models are not open source, so you can't download Gemini to your local machine. There are very good open-source models available for exactly this use case: Google has made Gemma available, and there are other strong open-source models to choose from.
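If the goal is simply local inference with an open model, one minimal sketch is to use Ollama (an assumption on my part; the thread names no specific tooling, and model tags vary by Ollama version):

```shell
# Install Ollama on macOS/Linux (see ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a Gemma model and run a quick prompt locally
# (the exact tag, e.g. "gemma3", depends on your Ollama version)
ollama pull gemma3
ollama run gemma3 "Hey"
```

Alternatively, Gemma weights are published on Hugging Face (e.g. `google/gemma-2b`, after accepting the license terms), which also gives you a path to fine-tuning locally with the usual open tooling.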
Gemini is not open source and cannot be downloaded, but note that running open models locally can be slow. I have run Gemma on my own computer: the response time for "Hey" was about 5 seconds, slower than the same prompt would have been through the cloud API.