Inquiry Regarding QAT Version of Gemma-27B Model

Hello,

I am currently working with the Gemma family of models and wanted to inquire specifically about the availability of a Quantization-Aware Training (QAT) version of the Gemma-27B model.

Could you please confirm whether a QAT-trained or QAT-ready variant of Gemma-27B exists? If so, I would greatly appreciate any documentation, technical details, or guidance on accessing and deploying it.

Additionally, because we intend to use the model for commercial purposes, could you please clarify under what license the QAT model (or its weights) is provided?
Specifically, we are looking for usage under MIT, Apache-2.0, or a similarly permissive license to ensure compatibility with our use case.

Thank you for your time and assistance. I look forward to your response.


Hi @Dibyajyoti_Mishra
Thank you for reaching out.

Yes, the official Quantization-Aware Training (QAT) variant of the Gemma-27B model is available. The documentation, technical details, and licensing information you need are all on the official Hugging Face model card, linked here:
https://huggingface.co/google/gemma-3-27b-it-qat-q4_0-gguf
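If it helps with deployment, here is a minimal sketch of fetching the GGUF checkpoint from that repo with `huggingface-cli` and running it with llama.cpp's `llama-cli`. The helper functions, the local directory name, and the `.gguf` filename are illustrative assumptions, not from the model card; check the repo's "Files and versions" tab for the actual filename.

```python
# Sketch only: builds the CLI invocations rather than executing them,
# since the download is ~15 GB and requires accepting the license on the Hub.

REPO_ID = "google/gemma-3-27b-it-qat-q4_0-gguf"  # repo from the linked model card

def download_command(repo_id: str) -> list[str]:
    """huggingface-cli invocation that fetches the repo's files locally."""
    # --local-dir is a real huggingface-cli flag; "gemma-qat" is an
    # arbitrary target directory chosen for this example.
    return ["huggingface-cli", "download", repo_id, "--local-dir", "gemma-qat"]

def run_command(gguf_path: str) -> list[str]:
    """llama.cpp llama-cli invocation against the downloaded GGUF file."""
    # -m selects the model file, -p supplies a prompt.
    return ["llama-cli", "-m", gguf_path, "-p", "Hello"]

if __name__ == "__main__":
    print(" ".join(download_command(REPO_ID)))
    # The .gguf filename below is a placeholder -- use the real one from the repo.
    print(" ".join(run_command("gemma-qat/model.gguf")))
```

From there, licensing terms (Gemma is distributed under Google's own terms of use, not MIT/Apache-2.0) are stated on the model card itself, so please review them for your commercial use case.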

Thanks