Comment by majormajor
15 hours ago
Having the humans document the code seems backward (maybe that's not what they're doing, but "make everything ready for ai" sounds manual). And hopefully there aren't that many scary surprises that humans need to document by hand.
One of the best parts of LLMs is that you can use them to bootstrap your documentation, or scan for outdated things, etc, far more quickly than ever before.
Don't just throw a mountain of code at it and ask it to get everything right; use a targeted process to identify inconsistencies, duplicates, etc., and then resolve those.
And then you have better onboarding material for the next human OR llm...
The humans are the only ones who know the why. The what is cheap.
Oddly enough, asking an AI to add docs to a class file explaining "what it does, why it needs to exist, and what uses it" is a great way to include some of the "why". I know it's not ALL the why, but it does a pretty good job of surfacing the reasons that someone new to the code wouldn't be aware of.
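For illustration, here's a hedged sketch of the kind of header documentation that prompt pattern tends to produce. The class, its purpose, and its callers are all hypothetical, invented for the example:

```python
class RetryPolicy:
    """Decides whether a failed outbound HTTP request should be retried.

    What it does: caps retries at ``max_attempts`` and only retries
    transient server errors (5xx), never client errors (4xx).

    Why it exists: centralizes retry rules so individual API clients
    don't each reinvent (and subtly disagree on) retry behavior.

    What uses it: (hypothetically) the payment and webhook clients.
    """

    def __init__(self, max_attempts: int = 3):
        self.max_attempts = max_attempts

    def should_retry(self, attempt: int, status_code: int) -> bool:
        # Retry only transient server errors, up to the attempt limit.
        return attempt < self.max_attempts and status_code >= 500
```

The "what uses it" line is the part a newcomer (human or LLM) can't infer from the file alone, which is exactly the "why" this approach captures.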
> Having the humans document the code seems backward (maybe that's not what they're doing, but "make everything ready for ai" sounds manual).
No, that's forward. Any documentation an AI can make, another AI can regenerate. If an LLM didn't write the code, it shouldn't document it either. You don't want to bake in slop to throw off the next LLM (or person).