Comment by tptacek

15 hours ago

Yeah, I used to be skeptical of the government provenance of things like Stuxnet (I am not any more, I'm fully sold, like everyone else), and notes like this were why. People used RCS well into the 2000s! RCS as a tool had virtues over SVN and CVS.

My favorite part of the paper is that the “attack” isn’t just exploiting a bug — it’s exploiting how different components interpret the same input. Modifying an executable as it’s loaded into memory is one example, but the deeper pattern is the mismatch.

What’s interesting about the malware in this post is that it goes one step further: instead of exploiting mismatches, it corrupts the computation itself — so every infected system agrees on the same wrong answer!

More broadly: any interpretive mismatch between components creates a failure surface. Sometimes it shows up as a bug, sometimes as an exploit primitive, sometimes as a testing blind spot. You see it everywhere — this paper, IDS vs OS, proxies vs backends, test vs prod, and now LLMs vs “guardrails.”

Fun HN moment for me: as I was about to post this, I noticed a reply from @tptacek himself. His 1998 paper with Newsham (IDS vs OS mismatches) was my first exposure to this idea — and in hindsight it nudged me toward infosec, the Atlanta scene, spam filtering (PG's bayesian stuff) and eventually YC.

https://users.ece.cmu.edu/~adrian/731-sp04/readings/Ptacek-N...

The paper starts with this Einstein quote: "Not everything that is counted counts and not everything that counts can be counted", which seems quite apt for the malware analyzed here :)

  • Just curious, are you purposely mocking the LLM writing style?

    • That’s how everybody used to write: people in academia, people in tech, published authors in general.

      Where do you think the LLM is getting it from? ^_^

> used to be skeptical of the government provenance

Do you mean skeptical about which government was responsible, or skeptical that it was a government effort at all?

I can see how attribution could be debatable (mainly between two suspects), but are / were there any good arguments against this being a government effort? I would find it highly unlikely that anyone other than a government could muster so much domain knowledge, source pristine 0-days, and be so stealthy, all at the same time.

I do wonder if these breadcrumbs were also left intentionally. “Oh look, we are using old stuff, don’t be afraid!” Or for some other reason. It is a little surprising to pull off such a sophisticated attack and miss details you could find by running ‘strings’, unless I’m missing something and this part was encrypted.

  • I think that in the time period we're talking about, RCS wasn't really even all that old. Like, RCS is old, sure, but it was also in common use especially by Unix systems people; it's what you might have reached for by default to version your dotfiles, for instance.

    • Yes, but even back then I was aware of the sections in executables (wasn’t this where it was found?), and any neckbeard from the 70s and 80s might be even more aware. That said, yeah, sure, it’s a very possible and understandable oversight, but I’m wary because of all the text left in viruses and such as indicators. It seems like a pass with ‘strings’ would be an obvious check. Though, TIL: strings doesn’t necessarily scan the entire executable.
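That TIL is easy to check for yourself. A minimal sketch, assuming GNU binutils (the file name `demo.bin` and the `$Id$` keyword string are made up for illustration): older versions of GNU strings by default only scanned the initialized, loaded sections of recognized object files, while `-a` scans the whole file; recent binutils releases default to scanning everything.

```shell
# Simulate an RCS $Id$ keyword embedded after a NUL, the way version
# strings end up in binaries outside the obvious text sections.
printf 'hello' > demo.bin
printf '\0\0$Id: implant.c,v 1.4 $\0' >> demo.bin

# -a ("--all") forces a scan of the entire file, so the keyword
# shows up even on strings versions with the old section-only default.
strings -a demo.bin
```

So a malware author relying on the old default behavior could plausibly assume a casual `strings` pass would miss data tucked into unloaded sections.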
> People used RCS well into the 2000s!

I still use RCS today. It's certainly not my preferred option, but my collaborator likes it, and it's not too annoying for me to use.