Comment by aluzzardi
15 hours ago
Author here. Because of parallelism and non-determinism.
This problem is quite common and not limited to memories. For instance, Claude Code will block write attempts and steer the agent to perform a read first (because the file might have been modified in the meantime by the user or another agent).
Same principle here: rather than trying to deterministically “merge” concurrent writes, you fail the conflicting write and let the agent read the latest state and attempt the write again.
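This is classic optimistic concurrency control. A minimal sketch of the idea (names like `MemoryStore` and `WriteConflict` are my own, not from the original): each read returns a version number, and a write must present the version it read; if the store has moved on, the write fails and the caller must re-read.

```python
import threading


class WriteConflict(Exception):
    """Raised when a write cites a stale version; the caller should re-read and retry."""


class MemoryStore:
    """Versioned store enforcing optimistic concurrency: read-before-write."""

    def __init__(self):
        self._lock = threading.Lock()
        self._value = None
        self._version = 0

    def read(self):
        # Return the current value together with its version token.
        with self._lock:
            return self._value, self._version

    def write(self, new_value, expected_version):
        # Reject the write if someone else wrote in between; no silent merge.
        with self._lock:
            if expected_version != self._version:
                raise WriteConflict(
                    f"expected v{expected_version}, store is at v{self._version}"
                )
            self._value = new_value
            self._version += 1
            return self._version


# Agent-side loop: on conflict, read again and retry the write.
def write_with_retry(store, make_value, max_attempts=3):
    for _ in range(max_attempts):
        current, version = store.read()
        try:
            return store.write(make_value(current), version)
        except WriteConflict:
            continue  # another writer got there first; re-read and try again
    raise WriteConflict("gave up after repeated conflicts")
```

The point is that the conflict is surfaced to the agent instead of being resolved by the store, so the agent can re-derive its write from the latest state.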