Can we use a model that we have tuned in Vertex AI in AI Studio? If yes, how?
Hi @Rose_Mirshafian,
Welcome to the Google AI Forum!
Yes, you can. After tuning a foundation model in Vertex AI, the new model is registered in Vertex AI and becomes available for testing within Google AI Studio.
Thank you, Krish! I deployed to the endpoint and my tuned model is in the model registry, however I can’t see any tuned models in AI Studio, I just see Gemini models there. My tuned model is available for testing in Vertex AI Studio, though. Do you mean it can be available in AI Studio as well (not Vertex AI Studio)?
Once your model is deployed in Vertex AI, it should be exposed as a secure and scalable API endpoint that you can access from any environment that can make REST or gRPC calls, including a Colab notebook opened from AI Studio.
Here are the steps:
- Navigate to the Model Registry in the Vertex AI section of the Google Cloud Console.
- Find and select your registered model.
- Click on “Deploy to endpoint”.
- Configure your endpoint by giving it a name and setting the machine type and scaling options.
- Click “Deploy”. The process may take several minutes.
Refer to this video for more clarity.
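If you want to call the deployed endpoint from outside the Console, for example from a Colab notebook opened alongside AI Studio, here is a minimal REST sketch. The project, location, and endpoint values below are placeholders (not taken from this thread), and it assumes application-default credentials are available:
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Placeholder values; replace with your own project, region, and endpoint ID.
PROJECT_ID = "your-project-id"
LOCATION = "us-central1"
ENDPOINT_ID = "your-endpoint-id"

# Use application-default credentials (e.g. from `gcloud auth application-default login`).
credentials, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
session = AuthorizedSession(credentials)

# Call the endpoint's :predict method over REST.
url = (
    f"https://{LOCATION}-aiplatform.googleapis.com/v1/"
    f"projects/{PROJECT_ID}/locations/{LOCATION}/endpoints/{ENDPOINT_ID}:predict"
)
response = session.post(url, json={"instances": [{"input": "your prompt here"}]})
print(response.json())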
When you fine-tune a foundation model in Vertex AI, the tuned model won’t automatically appear in the Google AI Studio interface — this is by design. AI Studio currently surfaces only base foundation models and certain integrated tuned models, not custom-tuned models from your Vertex AI registry.
However, once your tuned model is deployed to an endpoint in Vertex AI, it becomes fully accessible across environments. You can use the same endpoint from:
• AI Studio Notebooks or Colab, by making API calls to the endpoint.
• Any application or backend using the Vertex AI SDK or REST API.
Here’s the standard workflow:
- Fine-tune your foundation model in Vertex AI.
- Verify that the tuned model appears in your Model Registry.
- Deploy the model to an endpoint (Vertex AI → Models → Deploy).
- Copy the endpoint ID and call it via the Vertex AI SDK or REST API:
from google.cloud import aiplatform

# Reference the deployed endpoint by its full resource name.
endpoint = aiplatform.Endpoint("projects/<PROJECT_ID>/locations/<LOCATION>/endpoints/<ENDPOINT_ID>")
# Send a prediction request to the tuned model.
response = endpoint.predict(instances=[{"input": "your prompt here"}])
print(response)
- You can now access that model seamlessly from AI Studio or Colab using the same API credentials.
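As a rough sketch of what that can look like from a Colab notebook (the project, location, and endpoint ID below are placeholders, not values from this thread):
from google.colab import auth
from google.cloud import aiplatform

# Authenticate the Colab runtime with your Google account.
auth.authenticate_user()

# Point the SDK at your project and region (placeholder values).
aiplatform.init(project="your-project-id", location="us-central1")

# Call the same deployed endpoint as above, this time by its ID.
endpoint = aiplatform.Endpoint("your-endpoint-id")
response = endpoint.predict(instances=[{"input": "your prompt here"}])
print(response.predictions)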
In short: The tuned model won’t show up in AI Studio’s dropdown list, but once it’s deployed, it’s fully usable via its endpoint — effectively bridging Vertex AI Studio and AI Studio environments.
For a visual walkthrough, refer to Google’s official video on deploying and invoking tuned models in Vertex AI.
I only see testing methods that use the SDK or call the REST API, so does that mean I cannot use my fine-tuned model directly in the Studio within Vertex AI?
You can use it in Vertex AI Studio, not in Google AI Studio. Go to your model and test it in a prompt; that is how you can use it in Vertex AI Studio.