Cached content is too small. total_token_count=1

I am trying to make context caching from Google AI Studio's Gen AI SDK work by following the documented examples. Context caching does not work, and I have tried all sorts of combinations.

As suggested in various threads on this forum, I first uploaded the file using the File API, as shown here.

const {
  GoogleGenerativeAI
} = require("@google/generative-ai");
const {
  GoogleAICacheManager,
  GoogleAIFileManager,
} = require("@google/generative-ai/server");

const cacheManager = new GoogleAICacheManager(process.env.GEMINI_API_KEY);

// Note: the template literal needs backticks, otherwise this line is a syntax error.
const fileData = `${process.env.DATA_ROOT_FOLDER}/full_prompt.txt`;
const fileManager = new GoogleAIFileManager(process.env.GEMINI_API_KEY);

const uploadResult = await fileManager.uploadFile(fileData, {
  mimeType: "text/plain",
});
console.log("uploadResult", uploadResult);

uploadResult shows the file size as 165358 bytes:

uploadResult {
  file: {
    name: 'files/hchgla0a7shh',
    mimeType: 'text/plain',
    sizeBytes: '165358',
    createTime: '2025-03-31T12:51:07.280031Z',
    updateTime: '2025-03-31T12:51:07.280031Z',
    expirationTime: '2025-04-02T12:51:07.208250706Z',
    sha256Hash: 'MDIwOTA0MzdiNjJlOGFkNGI3MzNkMGVjMjEzMzZiOTkzN2IzZjc4NTI2NDcxN2MwZTc0N2M5YWJlODc2NzljNg==',
    uri: 'https://generativelanguage.googleapis.com/v1beta/files/hchgla0a7shh',
    state: 'ACTIVE',
    source: 'UPLOADED'
  }
}
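For what it's worth, a rough back-of-the-envelope estimate (assuming roughly 4 characters per token for English text, which is only a heuristic, not the real tokenizer) suggests a 165358-byte text file should comfortably exceed the 32768-token minimum the error message cites:

```javascript
// Rough token estimate for a plain-text file.
// Assumption: ~4 characters per token for English text (a common heuristic,
// not the actual Gemini tokenizer).
const MIN_CACHE_TOKENS = 32768; // min_total_token_count from the error message

function estimateTokens(sizeBytes) {
  return Math.floor(sizeBytes / 4);
}

const estimated = estimateTokens(165358);
console.log(estimated); // 41339
console.log(estimated >= MIN_CACHE_TOKENS); // true
```

So the file contents themselves should be large enough; the question is why the caching endpoint sees only 1 token.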

Then I call the caching API:

// Note: the model name must be a quoted string.
const cacheResult = await cacheManager.create({
  model: "models/gemini-1.5-flash-001",
  contents: [
    {
      role: "user",
      parts: [
        {
          fileData: {
            fileUri: uploadResult.file.uri,
            mimeType: uploadResult.file.mimeType,
          },
        },
      ],
    },
  ],
});

This always fails with the following error. When would total_token_count be 1, given that the upload result shows a healthy file size?

GoogleGenerativeAIFetchError: [GoogleGenerativeAI Error]: Error fetching from https://generativelanguage.googleapis.com/v1beta/cachedContents: [400 Bad Request] Cached content is too small. total_token_count=1, min_total_token_count=32768