Hi,
here is my configuration:
Antigravity Version: 1.19.6
VSCode OSS Version: 1.107.0
Commit: d2597a5c475647ed306b22de1e39853c7812d07d
Date: 2026-02-26T23:07:23.202Z
Electron: 39.2.3
Chromium: 142.0.7444.175
Node.js: 22.21.1
V8: 14.2.231.21-electron.0
OS: Darwin arm64 25.3.0
Language Server CL: 875890115
It displays “Agent Loading” and nothing loads.
What can I do to fix this?
Thank you for your help.
Hi, I have the same problem, and I also have trouble signing in to Antigravity through the direct Google link (in the Chrome browser). After I choose my account, the view automatically jumps to the Google One page instead of the actual login page, even though I selected the free plan beforehand. One of my Google accounts has a free trial of Gemini Pro 3.1 and the other does not, but the result is the same when I try to log in with either. Can anyone help me solve this?
Hi,
Solution: Preventing context loss when conversations become too heavy to load
I had the exact same issue — my conversation grew too large over several days of intensive work and eventually wouldn’t load anymore. When I started a new conversation, the AI had zero memory of everything we’d built together.
Here’s the solution I implemented to make sure this never causes a problem again:
The Problem
Long conversations accumulate too much context (messages, artifacts, tool calls). At some point, the conversation becomes too heavy to load and you’re stuck starting fresh with an AI that doesn’t remember anything.
The Solution: A Startup Workflow
I created a workflow file at .agents/workflows/demarrage.md that acts as an automatic “memory restore” procedure. Every time I start a new conversation, I just type /start and the AI reads all of its memory files and is back up to speed in under 2 minutes.
The key is maintaining a “cold memory” file — a markdown document that contains everything the AI needs to know: project context, technical environment, rules, history of past interventions, etc. This file lives in your workspace and gets updated at the end of every session.
How to set it up
- Create a memory file in your workspace (e.g., AI_MEMORY.md) containing:
  - Your identity and preferences
  - Technical environment details
  - Rules and working methodology
  - History of past sessions/interventions
  - Current TODO items
- Create a workflow at .agents/workflows/startup.md:

```yaml
---
description: Startup workflow — AI reads its memory and restores full context
---
```

  Then add steps that tell the AI to read your memory file(s) and summarize what it understood before doing anything else.
- Update the memory file at the end of every work session — this is crucial! If you don’t update it, you’ll lose the latest context.
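To make the first step concrete, here is a minimal skeleton for the memory file. The section names and placeholders are only suggestions, not a required format; adapt them to your project:

```markdown
# AI_MEMORY.md — cold memory for the AI

## Identity & preferences
- Who you are, preferred language, tone, and response style

## Technical environment
- OS, toolchain versions, key paths and repositories

## Rules & working methodology
- Coding conventions, review process, things the AI must never do

## History of past sessions
- <date>: <what was done, decisions made, files touched>

## Current TODO
- [ ] <next task>
```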
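And here is a sketch of what the complete startup workflow file could look like. The description line comes from the post above; the numbered steps and the AI_MEMORY.md filename are illustrative assumptions, so adjust them to match your own memory file(s):

```markdown
---
description: Startup workflow — AI reads its memory and restores full context
---

1. Read AI_MEMORY.md at the workspace root.
2. Summarize the project context, rules, and open TODO items you found there.
3. Wait for my confirmation before making any changes.
```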
Why this works
- The AI doesn’t rely on conversation history anymore — it relies on files
- Files persist across conversations, unlike chat history
- The workflow ensures the AI reads them systematically at every startup
- You can also ask a second AI agent (if you have one, like a bot on your server) to serve as a “living backup” of context
Lesson learned
Don’t trust conversation continuity for critical projects. Treat your AI’s memory like a database: persist it to files, and have a restore procedure ready.
Hope this helps! 
Hi Everyone,
Thank you @Romain_Antigravity for bringing this to our attention and providing the community with a solution. We have escalated the issue to our internal teams for a thorough investigation. Please follow the main thread for updates.
My current conversation is stuck loading continuously — is there any way to recover it?
This still isn’t fixed? damn
Edit: Seems to be fixed in the newest version, 1.20.3. I had to download and update manually, as auto-update isn’t triggering.