"Failed to count tokens for this attachment" Error

I am getting the same error (with all 1.5 models in AI Studio) when I try to upload a plain text file of about 29,800 words…


It has never really gone away for me either; I've just been forced to adapt by breaking the input file into smaller subsections, which is clearly less effective than it used to be. I'm guessing this is a technical constraint they don't want to talk about.
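In case it helps anyone doing the same workaround, here's roughly how I split a file before uploading. This is only a minimal sketch: the 25,000-word budget is an arbitrary guess at a "safe" size (not anything documented), and it assumes a plain UTF-8 text file with blank lines between paragraphs.

```python
from pathlib import Path

WORDS_PER_CHUNK = 25_000  # hypothetical budget; adjust to whatever avoids the error for you

def split_file(path: str, words_per_chunk: int = WORDS_PER_CHUNK) -> list[Path]:
    """Split a plain-text file into smaller files, breaking on paragraph boundaries."""
    src = Path(path)
    paragraphs = src.read_text(encoding="utf-8").split("\n\n")

    # Group paragraphs into chunks that stay under the word budget.
    chunks, current, count = [], [], 0
    for para in paragraphs:
        n = len(para.split())
        if current and count + n > words_per_chunk:
            chunks.append("\n\n".join(current))
            current, count = [], 0
        current.append(para)
        count += n
    if current:
        chunks.append("\n\n".join(current))

    # Write each chunk next to the original file, e.g. book.part1.txt, book.part2.txt, ...
    outputs = []
    for i, chunk in enumerate(chunks, start=1):
        out = src.with_name(f"{src.stem}.part{i}.txt")
        out.write_text(chunk, encoding="utf-8")
        outputs.append(out)
    return outputs

# Usage: split_file("book.txt") then upload the resulting .partN.txt files one at a time.
```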

Aside from this error, I'm also randomly getting the error "content not permitted" for no reason, forcing me to insult and threaten the AI every time just to make it work. Make it easier for everyone: reduce the damn censorship or REMOVE IT AS A WHOLE! It would make my job easier.

I have a pretty good prompt that I created, so all I do is apply that prompt to get through any BS censorship. It's a hassle sometimes, but if you look at the competition you'll feel grateful. Such "jailbreak" prompts barely work on models from Anthropic like Claude 3.5 Sonnet. Don't even get me started on GPT-4o; they're just straight up filtering words at this point.

Same problem here; still not working.

Any known workarounds other than reducing the file size?

Thanks. What is the point of 2M tokens with a 300-page limit when I can't even upload a book? Does anyone realise that most management books are more than 300 pages? Who writes books under 300 pages?