Hi Gemma community,
I’m a developer who is currently experimenting with integrating the Gemma 3 270M model into a Flutter application. I’ve become very curious about how to bring out the full potential of this model. For example, I’m interested in fine-tuning know-how for specific tasks or best practices for writing prompts that maximize performance.
I’m also curious about successful use cases and the creative ways other developers are using the 270M model.
Hi @itjimkr ,
Welcome to the Google AI Forum.
You are correct that you can fine-tune the Gemma models for specific use cases. There are two primary variants of the 270M model available on Hugging Face: the base model (google/gemma-3-270m) and the instruction-tuned model (google/gemma-3-270m-it).
The base model is trained on general data and provides broad knowledge, whereas the instruction-tuned model is specifically designed to follow a user’s instructions. Your choice should depend on the specific requirements of your project.
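To make the difference concrete, here is a minimal prompting sketch with the `transformers` library, assuming you have accepted the Gemma license on Hugging Face and have the model weights available locally or via your HF token. The key point is that the instruction-tuned variant expects a chat-formatted prompt, while the base model simply continues raw text.

```python
# Minimal sketch (not an official recipe): prompt the instruction-tuned
# 270M model using its chat template. Assumes `transformers` is installed
# and you have access to the gated Gemma weights on Hugging Face.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-3-270m-it"  # instruction-tuned variant
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The chat template wraps the request in the turn markers the -it model
# was trained on; the base model would instead just continue plain text.
messages = [{"role": "user",
             "content": "Summarize: Gemma 3 270M is a small open model."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:],
                       skip_special_tokens=True))
```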
For more information on the Gemma 270M models, please visit the official model card page on Hugging Face. To learn how to fine-tune the model for your own task, you can refer to the official Gemma fine-tuning documentation.
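As a rough illustration of what task-specific fine-tuning can look like, here is a hedged LoRA sketch built on `transformers`, `peft`, and `datasets`. It is not the official recipe; the file name `my_dataset.jsonl` and the hyperparameters are placeholders you would adapt to your own data.

```python
# Rough LoRA fine-tuning sketch for the base 270M model: small adapter
# weights are trained instead of the full model, which keeps memory low.
# Assumes `transformers`, `peft`, and `datasets` are installed, and that
# `my_dataset.jsonl` (hypothetical) contains examples with a "text" field.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model
from datasets import load_dataset

model_id = "google/gemma-3-270m"  # base variant as the fine-tuning starting point
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Attach small trainable LoRA adapters; the original weights stay frozen.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM"))

dataset = load_dataset("json", data_files="my_dataset.jsonl", split="train")
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
    batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gemma-270m-finetuned",
                           per_device_train_batch_size=4,
                           num_train_epochs=3,
                           learning_rate=2e-4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

After training, the resulting adapter can be merged or exported in a format your Flutter app's inference runtime supports; the exact conversion step depends on which on-device runtime you choose.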
Thank you.