Experiencing Error 503 Failed to Count Tokens

Hey All

I have been experiencing a 503 error since yesterday, with this message:

"code": 503, "message": "Failed to count tokens."

It occurs in my importFile endpoint, which imports from the /files folder with some metadata. The source is a JSON file that I split according to some IDs it contains.

Is anyone else experiencing this?

This is my JSON:
{
  "fileName": "{{ $json.file.name }}",
  "customMetadata": [
    {
      "key": "type",
      "stringValue": "lakoach"
    },
    {
      "key": "sochen",
      "stringValue": "{{ $('List a specific search store1').item.json.displayName }}"
    },
    {
      "key": "lakoach",
      "stringValue": "{{ $('Loop Over Items1').item.json.Mutzar.NetuneiMutzar.YeshutLakoach['MISPAR-ZIHUY-LAKOACH'] }}"
    }
  ],
  "chunkingConfig": {
    "whiteSpaceConfig": {
      "maxTokensPerChunk": 512,
      "maxOverlapTokens": 100
    }
  }
}
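
For reference, the same request body can be built and sanity-checked programmatically before posting it to the importFile endpoint. A minimal Python sketch, where the runtime values (file name, store display name, customer ID) are hypothetical placeholders standing in for the workflow's {{ ... }} expressions:

```python
import json


def build_import_body(file_name: str, sochen: str, lakoach: str) -> dict:
    """Build the importFile request body with custom metadata and chunking config.

    The field names mirror the JSON body from the post; the argument values
    are filled in at runtime by the workflow.
    """
    return {
        "fileName": file_name,
        "customMetadata": [
            {"key": "type", "stringValue": "lakoach"},
            {"key": "sochen", "stringValue": sochen},
            {"key": "lakoach", "stringValue": lakoach},
        ],
        "chunkingConfig": {
            "whiteSpaceConfig": {
                "maxTokensPerChunk": 512,
                "maxOverlapTokens": 100,
            }
        },
    }


# Hypothetical placeholder values for illustration only.
body = build_import_body("example.json", "My Store", "123456789")

# Confirm the body serializes to valid JSON (smart quotes or stray
# characters would surface here as a serialization/parsing mismatch).
assert json.loads(json.dumps(body)) == body
```

Serializing with json.dumps guarantees plain ASCII double quotes, which rules out one whole class of malformed-body errors before the request is ever sent.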

Any ideas? Can someone assist?

Hi @Roee_Aizman, welcome to the AI Forum!

Thanks for reaching out to us. Could you please share the Gemini model you are using?

Hey @Sonali_Kumari1, this is the File Search API, a.k.a. RAG.

The issue seems to be some limitation of the JSON file itself.

The file is well below 10 MB, and even below 1 MB, so my suspicion is the depth of the hierarchy or the key name length.

I couldn't find any documented limitations regarding either.
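
Since the limits are not documented, one way to narrow it down is to measure the file's nesting depth and longest key name directly and compare files that import successfully against the one that fails. A minimal sketch (the thresholds are for you to choose, not documented limits):

```python
import json


def json_stats(obj) -> tuple[int, int]:
    """Return (max_nesting_depth, max_key_name_length) for a parsed JSON value.

    Depth counts container levels: a scalar is 0, a flat dict/list is 1, etc.
    """
    if isinstance(obj, dict):
        child = [json_stats(v) for v in obj.values()]
        depth = 1 + max((d for d, _ in child), default=0)
        key_len = max(
            [max((len(k) for k in obj), default=0)] + [k for _, k in child]
        )
        return depth, key_len
    if isinstance(obj, list):
        child = [json_stats(v) for v in obj]
        depth = 1 + max((d for d, _ in child), default=0)
        key_len = max([0] + [k for _, k in child])
        return depth, key_len
    return 0, 0


# Example on a small nested document (hypothetical data):
doc = json.loads('{"Mutzar": {"NetuneiMutzar": {"MISPAR-ZIHUY-LAKOACH": 1}}}')
depth, longest_key = json_stats(doc)
print(depth, longest_key)  # 3 levels deep; longest key is 20 characters
```

Running this over a failing file and a working file side by side would show whether depth or key length actually correlates with the 503.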