The length of question token exceeds the limit (64) in React.js

Hello,

I am using TensorFlow.js with a BERT model in React.js, and when I submit a question I get this error:
The length of question token exceeds the limit (64)

This is the code that executes the question:

  const answerQuestions = async (e) => {
    // Run the model when Enter (key code 13) is pressed and the model has loaded
    if (e.which === 13 && model !== null) {
      const answers = await model.findAnswers(passage, question);
      setAnswer(answers);
      // `answer` still holds the previous state value here, since setAnswer is asynchronous
      console.log(answer);
    }
  };

It seems that the token limit should be increased, but I am not sure whether that is the right approach or how to do it.

The error message you're encountering means that the question you pass to the BERT model was split into more tokens than the model's 64-token question limit allows.
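If you are using the @tensorflow-models/qna package (its model exposes findAnswers the way your code does), that 64-token limit on the question is fixed inside the model, so the practical fix is to keep the question short rather than to raise the limit. Also note that the package documents the call as findAnswers(question, passage); with the arguments swapped, the whole passage gets tokenized as the question and exceeds the limit immediately. Here is a minimal sketch, assuming the same component state as your snippet (MAX_QUESTION_WORDS is just an illustrative safety margin, not part of the library):

  // Sketch only: assumes model, question, passage and setAnswer from the component above.
  const MAX_QUESTION_WORDS = 30; // rough margin, since subword tokens >= whitespace words

  const answerQuestions = async (e) => {
    if (e.which === 13 && model !== null) {
      // Keep the question comfortably under the 64-token question limit
      const trimmedQuestion = question.split(/\s+/).slice(0, MAX_QUESTION_WORDS).join(' ');

      // Documented argument order in @tensorflow-models/qna: (question, passage)
      const answers = await model.findAnswers(trimmedQuestion, passage);
      setAnswer(answers);
      console.log(answers); // log the fresh result rather than the stale `answer` state
    }
  };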

Tokens are not equivalent to words; they can be pieces of words or punctuation, because BERT uses subword (WordPiece) tokenization.
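That is why a question that looks short in words can still exceed 64 tokens. The splits below are only illustrative (the real pieces depend on the model's vocabulary), but they show why a whitespace word count is only a lower bound on the token count:

  // Illustrative WordPiece-style splits (actual pieces depend on the vocabulary):
  //   "playing"      -> ["play", "##ing"]        // 1 word, 2 tokens
  //   "tokenization" -> ["token", "##ization"]   // 1 word, 2 tokens
  // A quick lower-bound estimate before calling findAnswers:
  const estimateMinTokens = (text) => text.trim().split(/\s+/).length;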