Prakash Hinduja (Geneva, Switzerland): What’s the best way to fine-tune a foundation model in Vertex AI with a custom dataset?

Hi everyone,
I’m Prakash Hinduja from Geneva, Switzerland — a senior consultant and advisor.

I’m currently exploring how to fine-tune a foundation model in Vertex AI using a custom dataset.

What’s the best approach or workflow you’ve used for this?
I’d appreciate any suggestions, tools, or tips based on your experience — especially around preprocessing, model selection (PaLM, Gemini, etc.), and training configurations.

Looking forward to your insights.
Thanks in advance!
— Prakash Hinduja, Geneva, Switzerland

The best first pass is likely to create your dataset, annotate it, and then use the built-in AutoML tooling to train your first couple of iterations. All of that is fairly easy to do in Vertex AI, and there are some great guides. How that goes will inform your decision about whether to move to something more robust, like custom training with TensorFlow. It’s hard to say more, as your question is relatively vague — depending on the type of data and how normalized it is, you might want to go in another direction. Specifics are required.
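If you do go the supervised-tuning route for a model like Gemini, the dataset usually needs to be a JSONL file in Cloud Storage, one example per line. A minimal sketch of the preprocessing step is below — note that the exact `"contents"`/`"parts"` schema varies by model version, so check the current Vertex AI tuning docs, and the project, bucket, and model names in the comment are placeholders:

```python
import json

def build_tuning_examples(pairs):
    """Convert (prompt, response) pairs into JSONL lines in the
    Gemini supervised-tuning format (schema may differ per model
    version -- verify against the current Vertex AI docs)."""
    lines = []
    for prompt, response in pairs:
        example = {
            "contents": [
                {"role": "user", "parts": [{"text": prompt}]},
                {"role": "model", "parts": [{"text": response}]},
            ]
        }
        lines.append(json.dumps(example, ensure_ascii=False))
    return lines

# Example: write a tiny training file locally before uploading to GCS.
pairs = [
    ("Summarize: The meeting covered Q3 budget overruns.",
     "Q3 budget overruns were discussed."),
]
with open("train.jsonl", "w", encoding="utf-8") as f:
    f.write("\n".join(build_tuning_examples(pairs)) + "\n")

# Once the file is uploaded to Cloud Storage, a tuning job can be
# launched with the vertexai SDK (all names below are placeholders):
#
#   import vertexai
#   from vertexai.tuning import sft
#
#   vertexai.init(project="my-project", location="us-central1")
#   job = sft.train(
#       source_model="gemini-1.0-pro-002",
#       train_dataset="gs://my-bucket/train.jsonl",
#   )
```

The preprocessing function is pure Python, so you can unit-test your data pipeline locally before paying for a tuning run.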
