Aither protocols and checkpoint

A new generation of prompts

This approach changes the current understanding of prompts: it uses the agent as a smart script, making its output as predictable and precise as possible.

Why?

High token count, but precise and expected output

Less thinking

More generated content

In the experiment below:

130k+ token count on load
~15 s thinking
~200 s generating

Cons: the session has to be reset after making a checkpoint, which happens roughly every response.
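The checkpoint-then-reset loop described above can be sketched roughly. This is a minimal illustration only: `run_agent`, `save_checkpoint`, and the session-reset step are hypothetical placeholders, not a real API from this project.

```python
# Hypothetical sketch of the checkpoint-then-reset workflow described above.
# None of these functions are a real API; they only illustrate the flow.

def run_agent(prompt: str, context: list[str]) -> str:
    """Placeholder: send the protocol prompt plus prior context to the agent."""
    return f"response to: {prompt}"

def save_checkpoint(history: list[str]) -> list[str]:
    """Placeholder: persist conversation state so it can be reloaded later."""
    return list(history)  # in practice this would be written to disk

def main() -> list[str]:
    history: list[str] = []
    for step in ["step 1", "step 2"]:
        response = run_agent(step, history)
        history.append(response)
        checkpoint = save_checkpoint(history)
        # Reset the session after every checkpoint, then reload state from it.
        history = checkpoint
    return history
```

The key point the sketch captures is that the session does not persist: every response is followed by a checkpoint, and the next turn starts from that saved state rather than from a live session.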

google.dev wouldn’t let me share pictures because it’s my first day here 😭, but I can share them elsewhere, and I’ll even provide a video of how it works.
But I need help on this project.

Hello, how are you?
What kind of help do you want?
Tell me.