Can I run knowledge distillation (KD) with a QAT model?

I know that we usually run KD with float models. But in my case the teacher is a float model and the student is a QAT model (from tfmot). Can I run KD with this teacher and student as usual?

Hi @DEV_AI, apologies for the late response!
Yes, you can run knowledge distillation with a float teacher and a Quantization-Aware Training student, and it is a common practice for improving the accuracy of quantized models. The student simply trains with fake-quant ops inserted by tfmot while receiving the distillation loss from the teacher's soft targets. Thanks!
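Here is a minimal sketch of how that could look, assuming you already have a pretrained float `teacher`, a float `student` Keras model, and a `train_ds` dataset (all placeholder names). It wraps the student with `tfmot.quantization.keras.quantize_model` and uses a standard Keras-style distiller with a softened KL-divergence loss; the temperature and loss weighting are illustrative, not prescribed values.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Insert fake-quant ops into the float student so it trains quantization-aware.
qat_student = tfmot.quantization.keras.quantize_model(student)

class Distiller(tf.keras.Model):
    def __init__(self, student, teacher, temperature=3.0, alpha=0.1):
        super().__init__()
        self.student = student
        self.teacher = teacher
        self.temperature = temperature
        self.alpha = alpha  # weight on the hard-label loss

    def compile(self, optimizer, metrics, student_loss_fn, distill_loss_fn):
        super().compile(optimizer=optimizer, metrics=metrics)
        self.student_loss_fn = student_loss_fn
        self.distill_loss_fn = distill_loss_fn

    def train_step(self, data):
        x, y = data
        # Teacher stays frozen; only its soft predictions are used.
        teacher_logits = self.teacher(x, training=False)
        with tf.GradientTape() as tape:
            student_logits = self.student(x, training=True)
            student_loss = self.student_loss_fn(y, student_logits)
            distill_loss = self.distill_loss_fn(
                tf.nn.softmax(teacher_logits / self.temperature, axis=1),
                tf.nn.softmax(student_logits / self.temperature, axis=1),
            )
            loss = self.alpha * student_loss + (1.0 - self.alpha) * distill_loss
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.student.trainable_variables))
        self.compiled_metrics.update_state(y, student_logits)
        return {m.name: m.result() for m in self.metrics}

distiller = Distiller(student=qat_student, teacher=teacher)
distiller.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    metrics=[tf.keras.metrics.SparseCategoricalAccuracy()],
    student_loss_fn=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    distill_loss_fn=tf.keras.losses.KLDivergence(),
)
distiller.fit(train_ds, epochs=5)
```

After training, you would convert `qat_student` (not the distiller wrapper) to a fully quantized model, e.g. via the TFLite converter, as you would for any QAT model.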