"Failed to count tokens for this attachment" Error

Starting a few hours ago, I've noticed that larger attachments (PDFs above roughly 100k-200k tokens) give me the error in the title. Does anyone have any idea why?

5 Likes

Welcome to the forums!

Does this work for smaller attachments?
Does this work for attachments uploaded to Drive separately and then attached from Drive?

I’m wondering if there is a permissions issue with it accessing your Drive.

Thanks!

It does work for smaller attachments with no problem. I just gave it a try and the separately uploaded file still encounters the same issue.

1 Like

Update: I’ve come across three other reports so far of people describing this same issue, with high-token PDFs producing errors.

2 Likes

I’m seeing the same issue as well. It’s consistent across the board in my experience (including both PDFs in preexisting threads and new ones I’ve tried to upload); the largest PDF it didn’t reject was 95,739 tokens.

2 Likes

It doesn’t appear to be a permissions issue - I tried uploading through Google AI Studio, using a file previously uploaded into the prompt thread, and using something loaded into Drive separately, and all files over 100K tokens failed.
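For anyone who wants to rule out the AI Studio front end, here’s a minimal sketch of checking the same PDF through the API directly. This assumes the `google-generativeai` Python SDK, a Gemini 1.5 model name, and that `count_tokens` accepts an uploaded file handle; adjust all of those for your own setup.

```python
# Minimal sketch: ask the Gemini API to count tokens for a PDF directly,
# bypassing the AI Studio UI. Model name and SDK usage are assumptions.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

# Upload the PDF through the File API.
pdf_file = genai.upload_file(path="large_document.pdf")

model = genai.GenerativeModel("gemini-1.5-pro")

# If the UI error is purely client-side, this call should still return a count;
# if the limit is enforced server-side, it should fail here too.
print(model.count_tokens([pdf_file]))
```

If that call succeeds on a file the UI rejects, the problem is likely in the Studio front end rather than a hard backend limit.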

2 Likes

I can confirm it’s now broken. I tried to upload Leslie Lamport’s book, “Specifying Systems”, and the error message flashes at the bottom of the screen near the input field (and the PDF is not available in the prompt). I had previously uploaded that same book when the two-million-token models became available for testing (plus two other big books to get enough tokens). So it used to work, and now it doesn’t.

Added the tag ‘bug’, hoping it will attract attention from someone who can actually do something about it.

7 Likes

Really hope they address this. It’s been a problem for nearly a week now.

5 Likes

The model appears to be less effective if you break a larger file up into parts and ask it to analyze the parts together as a whole, so hopefully we can get the issue addressed.

3 Likes

Let’s see if we can attract some people. @Logan_Kilpatrick, @Lloyd_Hightower, this problem requires attention. It has festered long enough.

2 Likes

Hey folks, there is a 300-page limit for PDFs, which might be what you are pushing up against. Can anyone confirm whether you are also seeing issues with files smaller than this?

1 Like

I gave it a try, and this seems to be the case… but doesn’t that defeat the point of having a 2-million-token context window? I doubt a 300-page PDF can even push past 500k tokens.
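For a rough sense of scale, here’s a back-of-the-envelope estimate; both the per-page and per-word figures are assumptions, not official numbers:

```python
# Back-of-the-envelope only; per-page and per-word figures are assumptions.
words_per_page = 500    # assumed for a dense text page
tokens_per_word = 1.3   # assumed rough average for English prose
pages = 300             # the reported PDF page limit

print(f"~{pages * words_per_page * tokens_per_word:,.0f} tokens")  # ~195,000 tokens
```

Even with generous figures, 300 pages of text lands around a couple of hundred thousand tokens, a small fraction of the 2-million-token window.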

2 Likes

Leslie Lamport’s book is 382 pages. The thing is, it used to be possible to grok that book, and now it is not: the page limit was introduced. That is what causes the consternation.

1 Like

That completely negates the usefulness of the 2-million-token context window in AI Studio.

1 Like

Did they recently introduce that? Because it wasn’t there before.

I noticed the limit was upped 🙂

2 Likes

Really? I’m still finding that it fails to upload/analyze PDF files for me, with a “failed to count tokens” error message.

The document in question is an out-of-print, 207-page PDF, an old WEG Star Wars RPG module called “The DarkStryder Campaign”, and it will not upload no matter what I try.

Today I was playing with CSV file upload. After several tries, the only file I could get to upload was one cut down to 2.5 KB. Disappointed.