I am facing issues while streaming a response with Vertex AI. I have looked through many docs but cannot find how to handle the safety response; I want to raise an HTTPException when a chunk fails due to a safety warning. Here is an example of how I am trying to do it:
for response in responses:
    logger.debug("response chunk %s", response)
    # A streamed chunk is an object, not a dict, so check the candidate's
    # finish_reason attribute rather than using `in` / item access.
    finish_reason = (
        response.candidates[0].finish_reason.name if response.candidates else None
    )
    if finish_reason != "SAFETY":
        logger.debug("response chunk data %s", response.text)
        yield response.text, self.conversation_id
        generated_response += response.text
    else:
        logger.warning("Response is empty or blocked by safety filters.")
        raise HTTPException(
            status_code=422,
            detail="The response was blocked due to safety concerns. Please try rephrasing your query.",
        )