Prakash Hinduja Switzerland - How do I call PaLM via Vertex AI?

Hi everyone,

I’m Prakash Hinduja from Geneva, Switzerland, currently working on an AI project and exploring PaLM models via Vertex AI. I’m looking for a straightforward way to invoke the model, but I’m a bit unclear on the correct setup and steps.

I’d appreciate any code snippets, documentation links, or lessons learned if you’ve done this before.

Thanks so much in advance!
Prakash Hinduja, Geneva, Switzerland

To call PaLM models via Vertex AI, install the Vertex AI SDK for Python with pip:
pip install google-cloud-aiplatform
Then, for example, to use the Text Bison model (the same pattern works for any PaLM text model):

import vertexai
from vertexai.language_models import TextGenerationModel

# Initialize the SDK with your GCP project and a supported region
vertexai.init(project="your-project-id", location="us-central1")

# Load the PaLM text model and request a completion
model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    "Write a product description for a smartwatch.",
    temperature=0.7,        # sampling randomness (0 = deterministic)
    max_output_tokens=256,  # cap on the length of the generated text
)
print(response.text)
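One lesson learned: hosted model endpoints occasionally fail transiently (quota errors, 503s), so it's worth wrapping the call in a retry. Here's a minimal sketch; the helper name, backoff values, and the stand-in `flaky` function are my own illustration, not part of the Vertex AI SDK:

```python
import time

def predict_with_retry(predict_fn, prompt, retries=3, backoff=1.0):
    """Call predict_fn(prompt), retrying on exceptions with exponential backoff.

    predict_fn is any callable, e.g. lambda p: model.predict(p).text.
    """
    for attempt in range(retries):
        try:
            return predict_fn(prompt)
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries, surface the error
            time.sleep(backoff * (2 ** attempt))

# Demo with a stand-in function that fails twice, then succeeds
# (in practice, pass lambda p: model.predict(p).text instead):
calls = {"n": 0}
def flaky(prompt):
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return f"ok: {prompt}"

print(predict_with_retry(flaky, "hello", backoff=0.01))  # → ok: hello
```

In production you'd catch only the retryable exception types rather than bare `Exception`, but the structure is the same.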

Note:
Region support: most PaLM models are available in us-central1.
Authentication: Vertex AI uses Google Cloud credentials (for example a service account with Vertex AI access, or Application Default Credentials), not a standalone API key.
Vertex AI SDK docs: https://cloud.google.com/vertex-ai/docs/start/client-libraries
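For the authentication step above, a typical setup with the standard gcloud CLI looks like this (the project ID, service account name, and key path are placeholders you'd replace with your own):

```shell
# Enable the Vertex AI API for your project
gcloud services enable aiplatform.googleapis.com --project your-project-id

# Easiest for local development: Application Default Credentials
gcloud auth application-default login

# Alternatively, use a service account key (placeholder names):
gcloud iam service-accounts keys create key.json \
    --iam-account my-sa@your-project-id.iam.gserviceaccount.com
export GOOGLE_APPLICATION_CREDENTIALS=key.json
```

Once credentials are in place, `vertexai.init()` picks them up automatically; you don't pass a key in code.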