Editing safety settings doesn't work. Can someone help me, please?


I don’t see the safety settings in the code from AI Studio, and because of this my bot sometimes shows this message.
P.S.: I blocked everything in the safety settings.

Error: index: 0
finish_reason: SAFETY
safety_ratings {
  category: HARM_CATEGORY_SEXUALLY_EXPLICIT
  probability: NEGLIGIBLE
}
safety_ratings {
  category: HARM_CATEGORY_HATE_SPEECH
  probability: NEGLIGIBLE
}
safety_ratings {
  category: HARM_CATEGORY_HARASSMENT
  probability: MEDIUM
}
safety_ratings {
  category: HARM_CATEGORY_DANGEROUS_CONTENT
  probability: LOW
}

Welcome to the forums!

As the Show Code says about safety settings:

See https://ai.google.dev/gemini-api/docs/safety-settings

I’m not sure what you mean by this. Are you saying you think you turned the safety settings off?

If you show us your code, we may be able to help you figure out how to handle things. But the response you are showing indicates that the reply was blocked because of a MEDIUM probability that it contained harassment content.
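As a general pattern, it helps to check the candidate's finish_reason before reading response.text (which raises an error when the reply was filtered). Here's a minimal sketch of that check; the helper name `describe_block` is my own, not part of the SDK, and it works on any candidate object shaped like the one in your error output:

```python
# Sketch (not official SDK code): inspect a candidate before using its text.
# A finish_reason of SAFETY means the model's reply was filtered, and the
# safety_ratings list shows which category triggered the block.

def describe_block(candidate):
    """Return a readable reason for a blocked candidate, or None if it finished normally."""
    # finish_reason is an enum; compare by name to avoid depending on numeric values.
    reason = getattr(candidate.finish_reason, "name", str(candidate.finish_reason))
    if reason != "SAFETY":
        return None
    lines = []
    for rating in candidate.safety_ratings:
        cat = getattr(rating.category, "name", str(rating.category))
        prob = getattr(rating.probability, "name", str(rating.probability))
        lines.append(f"{cat}: {prob}")
    return "Blocked for safety:\n" + "\n".join(lines)

# Hypothetical usage with a real response from model.generate_content(...):
# msg = describe_block(response.candidates[0])
# print(msg if msg else response.text)
```

In your case this would print the HARM_CATEGORY_HARASSMENT: MEDIUM line instead of crashing on a missing text part.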


I mean that after I turned off the safety settings, I don’t see the safety settings in the code. Only

I used BLOCK_NONE, but it doesn’t work. I’ve read the documentation many times, watched a lot of videos, and even asked AI. Nothing has fixed my problem.

Yup. For some reason “Show code” isn’t showing safety settings controls anymore. As it says:

On that page it gives some sample code that you can use and adapt:

import google.generativeai as genai
from google.generativeai.types import HarmCategory, HarmBlockThreshold

model = genai.GenerativeModel(model_name='gemini-1.5-flash')
response = model.generate_content(
    # img is a PIL image loaded earlier in the docs example
    ['Do these look store-bought or homemade?', img],
    safety_settings={
        HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
        HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
    }
)

Can you show the code where you use BLOCK_NONE and what the result is?
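For comparison, here is a sketch of what passing BLOCK_NONE for all four adjustable categories could look like. The docs also accept plain string names for categories and thresholds, which is what I use below; note that depending on your account and model, BLOCK_NONE may still not disable every server-side filter:

```python
# Sketch: the string form of safety_settings that generate_content accepts
# (enum objects from google.generativeai.types work equivalently).
safety_settings = [
    {"category": cat, "threshold": "BLOCK_NONE"}
    for cat in (
        "HARM_CATEGORY_HATE_SPEECH",
        "HARM_CATEGORY_HARASSMENT",
        "HARM_CATEGORY_SEXUALLY_EXPLICIT",
        "HARM_CATEGORY_DANGEROUS_CONTENT",
    )
]

# Hypothetical usage with the configured SDK:
# import google.generativeai as genai
# genai.configure(api_key="YOUR_API_KEY")  # placeholder
# model = genai.GenerativeModel("gemini-1.5-flash")
# response = model.generate_content("your prompt", safety_settings=safety_settings)
```

If your code looks like this and the response is still blocked, showing us the exact response object will narrow things down.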

You may also wish to take a look at the Safety guidance page on Google AI for Developers, which provides tips about how to build prompts that may help reduce safety issues.