Gemini models with Python 3.8

How can I use Gemini models (2.5 Flash) with Python 3.8?

The genai SDK is not supported on Python versions < 3.9.


@Rim_Benjaballah, welcome to the community.

You can always use a library like "requests" to make a REST API call instead of the SDK.

You can try something like the following:

import requests
import json

api_key = "xxx--your-api-key-here--xxxx"

# OpenAI-compatible chat completions endpoint for the Gemini API
url = "https://generativelanguage.googleapis.com/v1beta/openai/chat/completions"

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}"
}

payload = {
    "model": "gemini-2.0-flash",  # swap in "gemini-2.5-flash" if that is the model you want
    "messages": [
        {"role": "user", "content": "Explain to me how AI works"}
    ]
}

try:
    response = requests.post(url, headers=headers, json=payload)
    response.raise_for_status()  # raise on HTTP error status codes
    response_data = response.json()
    print("Request Successful!")
    print("Response:")
    print(json.dumps(response_data, indent=2))
except requests.exceptions.RequestException as e:
    print(f"Request failed: {e}")