Any source for an officially released quantized MedGemma 4B-it or 27B-text model?
There are some out in the wild; however, I'm not sure how safe/reliable they are.
I want to try out these models, but I'm unable to get the needed TPU/GPU resources on Colab, and I don't have a GPU on my local machine.
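For anyone landing here with the same concern about trusting third-party quantized checkpoints: one workaround is to quantize the official weights at load time with bitsandbytes instead. Below is a minimal sketch, assuming a CUDA GPU runtime, the `google/medgemma-27b-text-it` Hub ID, and that you've accepted the model license on Hugging Face; the quantization parameters are illustrative, not a recommendation:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Quantize the official weights on the fly (4-bit NF4) so you don't have
# to rely on an unofficial pre-quantized checkpoint.
model_id = "google/medgemma-27b-text-it"  # gated repo; accept the license first

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,  # applied by bitsandbytes at load time
    device_map="auto",
)

messages = [{"role": "user", "content": "List common causes of iron-deficiency anemia."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The same pattern should work for the 4B model on a smaller GPU; the 27B variant needs roughly 14+ GB of VRAM even in 4-bit.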
No quantized model from us yet. We’re thinking about it.
Have you tried the free accelerators via Google Colab?
Thanks for the quick response.
If possible, could you please share a notebook that works with Colab TPUs?
I'm able to access a v2-4 TPU. However, it seems that the TPU isn't detected, and the notebook falls back to the CPU.
I tried various things to fix it, but it seems to be beyond beginner level.
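In case it helps with debugging, a quick sanity check on a Colab TPU runtime is to list the devices JAX can see (JAX comes preinstalled on Colab). This is just a diagnostic sketch, not an official fix:

```python
# Diagnostic sketch for a Colab TPU runtime. Assumes the runtime type is
# set to TPU under Runtime > Change runtime type.
import jax

devices = jax.devices()
print(devices)

# A working TPU runtime lists TpuDevice entries here. If you only see
# CpuDevice, the session has silently fallen back to the CPU, and any
# PyTorch/transformers code will also end up running on the CPU.
```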
Ack, let's see what we can do. Please stay tuned.