Comment by hinkley

1 day ago

> I have my doubts about any CS class/lecture, that teaches, that the "iterative version is easier to scan".

Well then you’re in luck, because that was not an academic statement but a professional opinion: mine.

You can’t optimize the patterns you can’t see, because they’re obscured by accidental complexity. And iteration is a frequent trick I use to surface deeper patterns.

Things like DFS add a lot of noise that gets in the way of seeing the pattern, IMO. But then again, if explicit stack management is the pattern you want to see, I suppose the iterative versions are clearer.
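To make that concrete, here's a minimal sketch of the two shapes in Python (the names `visit` and `children` are my placeholders, not anyone's real API):

    def dfs_recursive(node, visit, children):
        visit(node)
        for child in children(node):
            dfs_recursive(child, visit, children)

    def dfs_iterative(node, visit, children):
        # Explicit stack management: the bookkeeping the recursive
        # version hides is now in plain view.
        stack = [node]
        while stack:
            current = stack.pop()
            visit(current)
            # Push children in reverse (assuming children() returns
            # a list) so nodes pop in the same order the recursive
            # version visits them.
            stack.extend(reversed(children(current)))

Same traversal either way; the question is only which version puts the pattern you care about front and center.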

  • I think this is in the same space as 'imperative shell, functional core', because what ends up happening is that in a tree or DAG you keep swapping back and forth between decision and action, and once the graph hits about 100 nodes, only people who are very clever and very invested still understand it.

    A big win I mentioned elsewhere involved splitting the search from the action, which resulted in a more Dynamic Programming-style solution that avoided the need for a cache and for dealing with complex reentrancy issues. I'm sure there's another name for this, but I've always just called it Plan and Execute (see the sketch after this comment). Plans are an easy 'teach a man to fish' situation: sit at one breakpoint and determine whether the bug is in the execution, the scan, or the input data. You don't need to involve me unless you think you found a bug or a missing feature. And because you have the full task list at the beginning, you can also make decisions about parallelism that are mostly independent of the data structures involved.

    It's not so much enabling things that were impossible as enabling things that were improbable. Nobody was ever going to fix that code in its previous state. Now I had, and there were options for people to take it further if they chose.
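For the curious, here's a rough Python sketch of the Plan and Execute shape I'm describing. Every name in it (`Task`, `plan`, `execute`, `decide`, `perform`) is a hypothetical illustration, not the actual code from that project:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Task:
        node: str    # which node to act on
        action: str  # what to do to it

    def plan(root, children, decide):
        # Pure search phase: walk the graph and emit a flat task list.
        # No side effects happen here, so you can test it by asserting
        # on the returned list alone.
        tasks, stack, seen = [], [root], set()
        while stack:
            node = stack.pop()
            if node in seen:
                continue  # DAG-safe: each node visited once, no cache needed
            seen.add(node)
            action = decide(node)  # decision only; nothing acted on yet
            if action is not None:
                tasks.append(Task(node, action))
            stack.extend(children(node))
        return tasks

    def execute(tasks, perform):
        # Impure execution phase: run the plan one task at a time.
        # Because the whole list exists up front, this loop is also
        # the natural seam for sharding tasks across workers.
        for task in tasks:
            perform(task)

The breakpoint goes between the two calls: dump the task list, and you know immediately whether a bug lives in the scan, the execution, or the input data.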