Gemini: language bleed overruling prompt

A strange and (for me) unprecedented situation just occurred with Gemini. I was conversing with it in Brazilian Portuguese while building a prompt in English that required the output to be in English. We worked this way for almost three weeks without problems. Two days ago it "broke" and said it would only respond in the language of my question/request (in this case, Portuguese), regardless of the language or the requirements of the prompt. I thought it might be a context-window problem and opened a new chat, but the same thing happened immediately. When I questioned Gemini, it replied that a "language bleed" had occurred, activating system instructions that ignore and override any request to the contrary. In other words, to continue, I would need to input everything in English if I wanted the responses in English.

Has this happened to anyone else? This is terrible for anyone who needs or wants to work in a second language, and it feels like a regression in the model's capabilities.