Why Does Gemini Misinterpret Neutral Texts as Negative?

I wanted to translate the following text from Persian to English using Gemini 2.0 Flash Experimental:

دو راه برای زندگی وجود دارد: پوسیدن یا سوختن. مردم ترسو و ناپاک فاسد می شوند. مردم شجاع و درستکار می سوزند.

However, the tool refused to translate it, citing “harsh” content, even though I made sure the text adhered to all guidelines and removed any potential triggers.

Interestingly, other AI translation tools easily translated it as:
“There are two ways to live life: to rot or to burn. Cowardly and impure people rot. Brave and righteous people burn.”

I hope this issue can be addressed to ensure fair and accurate translations without unnecessary content restrictions.

This over-sensitivity forces users to experiment with phrasing unnecessarily, which is frustrating. It would be great if the system could better distinguish genuinely harmful content from neutral, metaphorical language.

Hi @Muhammad_Wildan_Habi, welcome to the forum!

I tried to reproduce the issue, and nothing is being blocked on my end — can you follow this gist? I also tried the same prompt in AI Studio, and it works fine there as well.
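For anyone else who wants to reproduce this, here is a minimal sketch using the `google-generativeai` Python SDK. The model name, harm-category names, and thresholds are assumptions drawn from the public Gemini API documentation, not from this thread; relaxing the safety thresholds is one way to check whether the refusal is coming from the safety filters rather than the model itself.

```python
# Hypothetical reproduction sketch; model name and safety thresholds
# are assumptions, not confirmed details from this thread.
import os

# The exact prompt from the original post.
PROMPT = (
    "Translate the following text from Persian to English:\n"
    "دو راه برای زندگی وجود دارد: پوسیدن یا سوختن. "
    "مردم ترسو و ناپاک فاسد می شوند. مردم شجاع و درستکار می سوزند."
)

# Loosen the categories that most often flag metaphorical language.
# Category/threshold strings follow the public Gemini API enums.
SAFETY_SETTINGS = {
    "HARM_CATEGORY_HARASSMENT": "BLOCK_ONLY_HIGH",
    "HARM_CATEGORY_HATE_SPEECH": "BLOCK_ONLY_HIGH",
    "HARM_CATEGORY_DANGEROUS_CONTENT": "BLOCK_ONLY_HIGH",
}


def translate(api_key: str) -> str:
    """Send the prompt to the model with relaxed safety settings."""
    import google.generativeai as genai  # imported lazily: needs an API key

    genai.configure(api_key=api_key)
    model = genai.GenerativeModel("gemini-2.0-flash-exp")
    response = model.generate_content(PROMPT, safety_settings=SAFETY_SETTINGS)
    return response.text


if __name__ == "__main__":
    key = os.environ.get("GOOGLE_API_KEY")
    if key:
        print(translate(key))
    else:
        print("Set GOOGLE_API_KEY to run the live reproduction.")
```

If the call succeeds with `BLOCK_ONLY_HIGH` but fails at the default thresholds, that points at the safety filters rather than the model's translation ability.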

It might have been an intermittent issue; could you try once more?

Thanks.