why do AI agents and LLMs not have swap space by default? why do we insist that they keep massive context windows in active memory? this isn't a new concept - operating systems have used paging space for a very long time.
i regularly have my agent go into a paging space directory, create a temporary file, and dump everything it's holding in active memory but not actually using. that makes it much less likely to get confused.
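a minimal sketch of what that workflow could look like - the paging directory name, the stub message format, and the `swap_out`/`swap_in` helpers are all hypothetical, just one way an agent might page stale context to disk:

```python
import json
from pathlib import Path

PAGING_DIR = Path("paging_space")  # hypothetical paging space directory


def swap_out(context: list[dict], keep_recent: int = 4) -> tuple[list[dict], Path]:
    """Page older context entries out to a swap file, keeping only the
    most recent entries plus a pointer stub in active memory."""
    PAGING_DIR.mkdir(exist_ok=True)
    stale, active = context[:-keep_recent], context[-keep_recent:]
    swap_file = PAGING_DIR / f"swap_{len(stale)}.json"
    swap_file.write_text(json.dumps(stale))
    # the stub tells the agent where the paged-out entries live
    stub = {
        "role": "system",
        "content": f"{len(stale)} earlier entries paged out to {swap_file}; "
                   "read that file back if you need them",
    }
    return [stub] + active, swap_file


def swap_in(swap_file: Path) -> list[dict]:
    """Read paged-out entries back in when the agent needs them again."""
    return json.loads(swap_file.read_text())
```

the key design choice is that the stub stays in active context, so the agent still knows the information exists and where to find it - same idea as a page table entry pointing at a swapped-out page.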