I am fed up with Google :( refusing to translate words. Really?

you are given words to translate these are ability able about above accept according account across act action from ENGLISH to POLISH and save to file using schema ENGLISH,POLISH , save in file called part0.txt
500 An internal error has occurred. Please retry or report in the Troubleshooting guide.

It is refusing simple translations!!!

Create the Generative AI model instance with tools

model = genai.GenerativeModel(
    model_name='gemini-1.5-flash-exp-0827',
    safety_settings={'HARASSMENT': 'block_none'},
    system_instruction="you are a helpful assistant",
    tools=tool_manager.load_tools_of_type("all")  # Load all tools initially
)

There is nothing wrong with the code, because if I ask a simple question it will respond. IT IS REFUSING TRANSLATIONS!!

import time
import os
import json
from typing import List, Dict
import re  # also used for removing ANSI escape codes
import google.generativeai as genai
import logging

from TOOL_MANAGER import ToolManager  # Import the ToolManager class
from tools.ai.update_focus import update_focus  # Import the update_focus function

# Set up logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)

API_KEY = "                 "  # Replace with your actual API key
genai.configure(api_key=API_KEY)

# Initialize Tool Manager
tools_folder = "tools"
tool_manager = ToolManager(tools_folder)
def extract_text_from_response(response) -> str:
    """Extracts the text content from a model response."""
    extracted_text = ""
    for candidate in response.candidates:
        for part in candidate.content.parts:
            extracted_text += part.text
    return extracted_text.strip()


def INTERPRET_function_calls(response, tool_manager) -> List[str]:
    """Interprets function calls from the model response and executes them."""

    results = []
    if response.candidates:
        for candidate in response.candidates:
            if hasattr(candidate, 'content') and hasattr(candidate.content, 'parts'):
                for part in candidate.content.parts:
                    function_call = getattr(part, 'function_call', None)
                    if function_call:
                        tool_name = function_call.name
                        tool_function = tool_manager.get_tool_function(tool_name)

                        if tool_function:
                            # Extract arguments and map them to function parameters
                            function_args = {}
                            for arg_name, arg_value in function_call.args.items():
                                function_args[arg_name] = arg_value

                            try:
                                # Execute the tool function
                                result = tool_function(**function_args)
                                results.append(
                                    f"Result of {tool_name}({function_args}): {result}")
                            except Exception as e:
                                logger.error(f"Error calling {tool_name}: {e}")
                                results.append(f"Error calling {tool_name}: {e}")
                        else:
                            logger.warning(f"Tool function '{tool_name}' not found.")
    return results






# Create the Generative AI model instance with tools
model = genai.GenerativeModel(
    model_name='gemini-1.5-pro',
    safety_settings={'HARASSMENT': 'block_none'},
    system_instruction="you are  helpfull assistant,  dont  use  emojs  in your  reponses",
    tools=tool_manager.load_tools_of_type("all_advanced")  # Load all_advanced tools initially
)


splitcontent = []
batch_size = 10
batch = []

with open("data/words.txt.txt", "r",encoding="utf-8") as file:
    for line in file:  # Iterate through the file line by line
        word = line.strip()  # Remove leading/trailing whitespace from the line (word)
        batch.append(word)

        if len(batch) == batch_size:
            splitcontent.append(batch)
            batch = []  # Reset the batch for the next set of words

    # Handle any remaining words that didn't form a full batch
    if batch:
        splitcontent.append(batch)

# Print the batches to verify the result


for i, batch in enumerate(splitcontent):
    time.sleep(5)
    separator = " , "  # Choose your desired separator
    my_string = separator.join(batch)

    print()
    prompt=f""" translate each word pair {my_string} create  example sentance, both polish and  english and  save  to file{i}.txt 
          example file    pies-dog   dog can bark-pies umie szczekać

"""
    prompt_str=str(prompt)
    print(prompt_str)

    try:
        response=model.generate_content(prompt_str)
        result = INTERPRET_function_calls(response, tool_manager)
        print(response)
    except Exception as e:
        print(e)
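One thing worth trying in this loop, since intermittent 500s sometimes clear on a retry, is a small backoff wrapper around generate_content. This is only a sketch and assumes the 500 surfaces as a raised exception whose message contains "500"; match on the actual exception type you see instead if that is more reliable.

import time

def generate_with_retry(model, prompt, max_attempts=4, base_delay=5):
    """Retry generate_content with exponential backoff on transient 500s (sketch)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return model.generate_content(prompt)
        except Exception as e:
            transient = "500" in str(e) or "internal error" in str(e).lower()
            if not transient or attempt == max_attempts:
                raise  # not transient, or out of attempts
            delay = base_delay * (2 ** (attempt - 1))
            print(f"Attempt {attempt} failed ({e}); retrying in {delay}s")
            time.sleep(delay)

# usage inside the batch loop:
# response = generate_with_retry(model, prompt_str)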







Are you seeing this with other models such as gemini-1.5-flash-001? Or is it just with the experimental model?

gemini-1.5-flash-001
gemini-1.5-pro-exp-0827
gemini-1.5-pro

It is simply refusing to do any type of organised translation, throwing "500 An internal error has occurred". This is not an issue with the code, because I can ask in a prompt to create a dog.txt file and it will do it, but if I request a few words to translate, bam, error 500:

translate each word pair zdolność, ability , zdolny, able , o, about , powyżej, above , zaakceptować, accept , zgodnie, according , konto, account , przez, across , działać, act , działanie, action create example sentance, both polish and english and save to file0.txt
example file pies-dog dog can bark-pies umie szczekać

500 An internal error has occurred. Please retry or report in the Troubleshooting guide.

Hi @Dev_DevFuFu

Try with another API key if possible. I tried to reproduce the code and it is working fine on my end. Here is the gist of the code. If you still have issues, please let me know.
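For isolation it may also help to drop everything except a plain text prompt. The sketch below assumes only the google-generativeai package, no tools and no system instruction; the model name, API key and word list are placeholders. If the same words still trigger a 500 here, the prompt content really is the variable; if not, the tool loading is the next suspect.

import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder

# Same kind of request as in the original post, but with no tools attached.
model = genai.GenerativeModel(model_name="gemini-1.5-flash-001")

prompt = ("translate each word to POLISH and answer as ENGLISH,POLISH lines: "
          "ability, able, about, above, accept")

response = model.generate_content(prompt)

# If the request was blocked rather than failing with a 500, the feedback
# normally says so; note that response.text raises if the response was blocked.
print(response.prompt_feedback)
print(response.text)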

Thanks

I’ve had successful translations between Hungarian and English with Pro and Flash.

  • Try gemini-1.5-pro-preview or gemini-1.5-flash-preview
  • Try to simplify the prompt, for example leave out the text file name part, which is weird
  • What programming language do you use?
  • What are your token limit configurations?
  • How long is your chat history?
  • Try structured output and few-shot examples, not just one shot (see the sketch after this list)
  • Organize your prompt a little better
  • The system prompt could influence how the model handles translations, for example
  • With the 500, can you see anything else on the call stack? Like is the gRPC layer erroring, or is there any faint hint towards the reason? If it were a safety setting, the model could communicate that stop reason without a 500
These are all shots in the dark.
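One way to combine the structured-output and few-shot points is sketched below. It assumes the google-generativeai SDK and a 1.5 model that supports JSON output; the schema and example pairs are made up for illustration. Writing the file yourself from the returned JSON also removes the file-saving tool call from the request, which is one more variable eliminated.

import google.generativeai as genai

model = genai.GenerativeModel(
    model_name="gemini-1.5-flash-001",
    generation_config=genai.GenerationConfig(
        response_mime_type="application/json",  # ask for machine-readable output
        temperature=0.2,
    ),
)

prompt = """Translate each English word to Polish and return a JSON list of
{"english": ..., "polish": ..., "example_en": ..., "example_pl": ...} objects.

Examples:
[{"english": "dog", "polish": "pies",
  "example_en": "The dog can bark.", "example_pl": "Pies umie szczekac."},
 {"english": "cat", "polish": "kot",
  "example_en": "The cat sleeps a lot.", "example_pl": "Kot duzo spi."}]

Words: ability, able, about, above, accept
"""

response = model.generate_content(prompt)
print(response.text)  # JSON string; write it to part0.txt (or any file) yourself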

The idea was to translate batches of words, 10 in a batch.
For each batch I was reinitialising the model, with time.sleep in between.
The crazy thing is that the difference between error 500 and a normal response is the content of the prompt. I tried to hardcode the prompt to be sure it does not include any strange markup, but it seems the difference is simply dictated by the content of the prompt... and that's crazy:
example: write a story about a dog: result OK
example: translate the words dog, cat, pig to Polish and save to file: ERROR 500
example: create file Dog.txt with an example story: result OK

Same code, different prompt, different result.

Just like with program code, keep adjusting the prompt, remove parts (like the weird file name save), and add more shot examples to narrow down the cause.
There are also hyperparameters like temperature, top-k and top-p; those settings can be important, and you haven't shown that code (a sketch follows below).
Susarla here demonstrated that your prompt worked, so the cause is somewhere else.
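For the hyper-parameter angle, here is a rough sketch of where temperature, top_p and top_k go in the google-generativeai SDK; the numeric values are arbitrary examples, not recommendations.

import google.generativeai as genai

model = genai.GenerativeModel(model_name="gemini-1.5-flash-001")

response = model.generate_content(
    "translate to Polish: ability, able, about",
    generation_config=genai.GenerationConfig(
        temperature=0.2,      # low randomness for more deterministic translations
        top_p=0.9,
        top_k=40,
        max_output_tokens=1024,
    ),
)
print(response.text)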