
Comment by mike_hearn

2 days ago

Ironically, this article is full of the sort of semantic confusions that cause the problem in the first place. The reporter clearly hasn't run the article past an actual programmer, as she seems to think this outcome is a deliberate design choice rather than a bug:

> Null was first programmed 60 years ago by a British computer scientist named Tony Hoare ... Hoare probably wasn’t thinking about people with the 4,910th most common surname. He later called it his billion-dollar mistake, given the amount of programmer time it has used up and the damage it has inflicted on the user experience.

Obviously Hoare's statement wasn't about this problem at all. She's also giving readers the impression Microsoft has some sort of policy against using null values:

> “It’s a difficult problem to solve because it’s so widespread,” said Daan Leijen, a researcher at Microsoft, who says the company avoids use of null values in its software.

Whatever Leijen said, I'm pretty sure it wasn't that.

I really don't get why journalists so rarely do basic fact checking of their own articles by asking an independent source for a final read-through. Many of them actually have policies against doing this, which leads to an endless stream of garbled articles that undermines their credibility without them even noticing.

> > “It’s a difficult problem to solve because it’s so widespread,” said Daan Leijen, a researcher at Microsoft, who says the company avoids use of null values in its software.

> Whatever Leijen said, I'm pretty sure it wasn't that.

I had a good lol when I read that, imagining some top-level decree to NEVER use null values in any context in all of Microsoft.

  • Afaik, there does exist a decree in most teams at MS to not use null if it can be avoided, and any attempt to do so is likely going to be flagged in code review - or so I'm told.

> Whatever Leijen said, I'm pretty sure it wasn't that

What makes you so sure? This is Hoare's apology for creating the null reference:

> I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.

Given that null references cause crashes, why would it be unreasonable for a researcher at Microsoft to say they try to avoid them? Is this journalist really so far off?
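The crash class Hoare is apologizing for is easy to show in miniature (an illustrative sketch, not code from the article):

```python
# Dereferencing a reference that is actually null/None -- the failure
# mode behind "innumerable errors, vulnerabilities, and system crashes".
class User:
    def __init__(self, name: str):
        self.name = name

def greet(user: "User | None") -> str:
    # If user is None, this raises AttributeError at runtime,
    # the dynamic-language cousin of a null-pointer crash.
    return "Hello, " + user.name

greet(User("Ada"))  # fine
try:
    greet(None)     # crashes: NoneType has no attribute 'name'
except AttributeError:
    pass
```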

> I really don't get why journalists so rarely do basic fact checking of their own articles

If you compare journalism to our other sources of information - such as comments on Hacker News and posts on social media - I think it holds up quite well, especially when the outlet is a reputable organization. It's quite fashionable for technology people to be highly critical of what they (pejoratively?) call "legacy media", but the alternatives the technology industry has brought forward, like social media, are far, far worse in terms of accuracy, and also do very little of the kind of investigative reporting that is crucial for holding powerful officials to account.

  • Re-read what the article says.

    > Null was first programmed 60 years ago by a British computer scientist named Tony Hoare ... Hoare probably wasn’t thinking about people with the 4,910th most common surname.

    It mentions null as a mistake and then ties it to the word 'null' by noting that a significant number of people have it as a last name. As though it would be better if the feature were called something like 'xkcd', which has no pronunciation and is unlikely to be anyone's surname.

    I think overall journalism is worse because it's perceived as being authoritative. A social media post might carry a similar level of information, but Wikipedia won't cite it, and laymen know to take it with a grain of salt. There's also better feedback, as the comment section is front and center. And someone with no knowledge, experience, or curiosity in a subject is less likely to comment on it, while a journalist's job is to churn out a wide variety of pieces on topics they're likely unfamiliar with.

  • The issue is that, the way it's presented in the article, a layman would interpret it to mean Microsoft doesn't use null values anywhere in any aspect of its software. Not as "we use them all the time pervasively but would like to do so a bit less", which is what was actually meant.

    We'll have to disagree about the accuracy of legacy vs other forms of media. Many of the best investigations I've ever read have been by independents on Substack, for instance.

> I really don't get why journalists so rarely do basic fact checking

It takes time and effort with no discernible upside. In fact, knowing the true facts would make it harder for journalists to bias the story the way they want without feeling a bit bad for lying. It's easier for them if they don't know.

> that undermines their credibility without them even noticing.

Not really. Even the vanishingly small minority of readers who know the details of the story in question suffer from Gell-Mann amnesia, and continue to believe that all other stories (by the same paper, and even from the same reporter) are perfectly accurate.

  • > It takes time and effort with no discernable upside.

    The true answer is incentives, opportunity, and aptitude. Incentives are skewed toward writing something that will engage readers, the human version of the social media algorithm. Opportunities are short because reporting is done on deadline, without the luxury of time to deeply ponder intermediate drafts. And reporters are writing around the edges of their expertise and training all the time. That specific reporter specializes in "personal finance", so it's a wonder the article even begins to make sense. Writers have to be good at two things at once, journalism and whatever they are writing about, and it's hard to be excellent at multiple disciplines.

    When you put it together, it's sort of magical when a news room works at all.

Unfortunately, that seems to be the quality of "professional" journalism nowadays. I wouldn't be surprised if AI was complicit as well (though I don't suppose it'd make much difference, as the slop was just as low-quality prior to recent years; it may as well have been AI-generated then too).

It used to be indie publications; now I find indie YouTubers tend to be generally superior (though you still have to do your own filtering and selection, of course).