Comment by PaulDavisThe1st
21 days ago
This seems to assume that debugging is required to get software to work properly. Sometimes that's true, sometimes it isn't.
And even when it is, sometimes the "not working properly, must debug" point occurs later (sometimes much later) than the "it appears to be working" point.
And I might say that the more skilled (experienced, wise) a programmer is, the more likely they are to write something 'clever', make it work, and not need to debug it. Why, then, should they follow Kernighan's advice and avoid working that way?
As I take it, the quote would have been about 1970s and 1980s C, and would not have had the benefit of an IDE with "edit and continue", a Lisp or Prolog or Smalltalk interactive REPL with live edit and retry, Elm's "time travelling debugger", Git and all the related tooling for tracking down changes and who made them, or more modern fuzzers, Valgrind, and static analyzers.
Making a case for writing non-surprising, idiomatic code is one thing, but HN parroting "debugging is twice as hard as coding" and downvoting someone who asks for evidence for that claim is cargo-culting. Why would it be twice as hard, and not 1.2x as hard, or equally hard, or 10x or 100x as hard? And why would the relationship stay fixed even as tooling, languages, and the industry change? And what does it even mean to say you can write, say, a compression algorithm "as cleverly as you can", but that it is twice as hard as that to spot that you typoed a variable name?