Agent terminated due to error in Windows Antigravity

How do we solve this error, which appears after the model finishes analysing code:
Trajectory ID: c58754da-249c-44a
Error: HTTP 400 Bad Request
Sherlog:
TraceID: 0xaf2404e0a
Headers: {"Alt-Svc":["h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000"],"Content-Length":["282"],"Content-Type":["text/event-stream"],"Date":["Thu, 26 Feb 2026 13:56:00 GMT"],"Server":["ESF"],"Server-Timing":["gfet4t7; dur=1107"],"Vary":["Origin","X-Origin","Referer"],"X-Cloudaicompanion-Trace-Id":["af2404e0a364c342"],"X-Content-Type-Options":["nosniff"],"X-Frame-Options":["SAMEORIGIN"],"X-Xss-Protection":["0"]}

{
  "error": {
    "code": 400,
    "message": "{\"type\":\"error\",\"error\":{\"type\":\"invalid_request_error\",\"message\":\"prompt is too long: 103514 tokens \u003e 102385 maximum\"},\"request_id\":\"req_vrtx_011CYWo6vMAX8oBeQzJ6zKHi\"}",
    "status": "INVALID_ARGUMENT"
  }
}

Hi @Amit_Khullar,

Welcome to the Forum,

The error message prompt is too long: 103514 tokens > 102385 maximum indicates that the agent's request to the underlying model exceeded the maximum number of tokens allowed in a single request.

This usually happens when the combination of your chat history, open files, and selected code becomes too large to fit into one request.

To solve this, please try the following:

  • Start a New Chat: This clears the previous conversation history.
  • Close Unused Tabs: The agent often reads all open tabs as context. Close any files you aren’t currently working on.
  • Check for Large Files: Ensure you haven’t inadvertently added a massive file to the chat context.
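For the last point, a quick script can help spot oversized files in a project before they blow the context budget. This is only a rough sketch: the ~4 characters-per-token ratio and the "flag anything over 10% of the budget" threshold are my assumptions, not the tokenizer Antigravity actually uses.

```python
import os

# Rough assumption: ~4 characters per token for English text and code.
# Real tokenizers vary, so treat these numbers as estimates only.
CHARS_PER_TOKEN = 4

def estimate_tokens(path):
    """Estimate a file's token count from its size on disk."""
    return os.path.getsize(path) // CHARS_PER_TOKEN

def find_large_files(root, limit_tokens=102_385):
    """Return (estimated_tokens, path) pairs for files that would eat
    a large share of the context budget, biggest first."""
    large = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                tokens = estimate_tokens(path)
            except OSError:
                continue  # unreadable file; skip it
            if tokens > limit_tokens // 10:  # flags files using >10% of budget
                large.append((tokens, path))
    return sorted(large, reverse=True)
```

Running `find_large_files(".")` in the workspace root lists candidates (logs, lockfiles, generated data) worth closing or excluding before retrying the agent.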