Is it possible to fine-tune on a large dataset?

I usually fine-tune models using Python’s google.generativeai library.
I have a dataset of approximately 100,000 entries, with each entry containing about 300 characters.
I’d like to know if it’s feasible to fine-tune a model with this amount of data.
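For reference, this is roughly the kind of tuning call I mean (a minimal sketch using google.generativeai's `create_tuned_model`; the API key, model name, and hyperparameters below are just placeholders):

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

# Illustrative examples; the real dataset has ~100,000 entries of ~300 characters each.
training_data = [
    {"text_input": "first example input", "output": "first example output"},
    {"text_input": "second example input", "output": "second example output"},
]

# Start the tuning job (source model and hyperparameters are placeholders).
operation = genai.create_tuned_model(
    source_model="models/gemini-1.5-flash-001-tuning",
    training_data=training_data,
    display_name="my-tuned-model",
    epoch_count=5,
    batch_size=4,
    learning_rate=0.001,
)

tuned_model = operation.result()  # blocks until the tuning job completes
print(tuned_model.name)
```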

Hi @itjimkr

Your per-example input size should be fine, since ~300 characters is well under the 40,000-character limit. The total dataset, however, must be under 4 MB, and 100,000 entries of roughly 300 characters each comes to around 30 MB, so please check your dataset against that limit. Also, please go through the documentation link below for details on the fine-tuning limits.
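As a quick sanity check, something along these lines can estimate whether a dataset fits those limits before you submit a tuning job (just a sketch; the service's own size accounting may differ from a JSON-serialization estimate):

```python
import json

# Hypothetical tuning examples in the usual {"text_input": ..., "output": ...} shape.
training_data = [
    {"text_input": "example input text", "output": "example output text"},
    # ... ~100,000 entries in the real dataset
]

MAX_INPUT_CHARS = 40_000          # per-example input limit mentioned above
MAX_DATASET_BYTES = 4 * 1024**2   # 4 MB overall dataset limit

# Count examples whose input exceeds the per-example character limit.
too_long = [ex for ex in training_data if len(ex["text_input"]) > MAX_INPUT_CHARS]

# Rough estimate of the serialized dataset size in bytes.
total_bytes = len(json.dumps(training_data).encode("utf-8"))

print(f"Examples over the {MAX_INPUT_CHARS}-character input limit: {len(too_long)}")
print(f"Approximate dataset size: {total_bytes / 1024**2:.2f} MB "
      f"(limit {MAX_DATASET_BYTES / 1024**2:.0f} MB)")
```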

Thanks