Comment by JdeBP

9 days ago

My WWW site has been served up by publicfile for many years now, and reading through this I kept having the same reaction, over and over: the assumption that "websites often use reverse proxies" gets quietly upgraded, in the rest of the article, into everyone always using back-ends and proxies. It is as if there were a monocultural world of HTTP/1.1 WWW servers; and not only does the author discount everything apart from the monoculture, xe even encourages enlarging the monoculture as a survival tactic, only then to state that the monoculture must be killed.

The irony of the article's encouragement, near its foot, to "Avoid niche webservers" because "Apache and nginx are lower-risk" is quite strong, given that my publicfile logs show that most of the continual barrage of attacks that a public WWW server like mine is subject to consists of query parameter injection attempts and of attacks quite evidently directed against WordPress, Apache, AWS, and this claimed "lower-risk" software. (There was another lengthy probe to find out where WordPress was installed a couple of minutes ago, as I write this. Moreover, the attacker who has apparently sorted every potentially vulnerable PHP script into alphabetical order and just runs through them must be unwittingly helping security people, I would have thought. (-:)

Switching from my so-called "niche webserver", which does not even have these mechanisms to exploit, to Apache or nginx would be a major retrograde step; not least because djbwares publicfile nowadays rejects HTTP/0.9 and HTTP/1.0 by default, and I would be going back to accepting them, were I foolish enough to take this paper's advice.
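
For anyone wondering what that rejection amounts to in practice, here is a minimal sketch in C. It is my own illustration, not publicfile's actual source, and the function name is hypothetical; it assumes the request line has already been read into a NUL-terminated buffer. An HTTP/0.9 simple request carries no version token at all, and HTTP/1.0 announces itself, so both can be turned away before a single header is even looked at.

    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical illustration, not publicfile source: decide whether
       a request line names an acceptable protocol version.  Returns
       nonzero if the request should be rejected. */
    static int reject_old_http(const char *requestline)
    {
        const char *version = strrchr(requestline, ' ');
        if (version == NULL)
            return 1;       /* no version token at all: HTTP/0.9 */
        ++version;
        if (strncmp(version, "HTTP/", 5) != 0)
            return 1;       /* malformed, or an HTTP/0.9 simple request */
        if (strcmp(version, "HTTP/1.0") == 0)
            return 1;       /* HTTP/1.0: off by default nowadays */
        return 0;
    }

    int main(void)
    {
        /* Sample request lines; only the HTTP/1.1 one survives. */
        const char *samples[] = {
            "GET /index.html",
            "GET /index.html HTTP/1.0",
            "GET /index.html HTTP/1.1",
        };
        for (size_t i = 0; i < sizeof samples / sizeof *samples; ++i)
            printf("%-28s -> %s\n", samples[i],
                   reject_old_http(samples[i]) ? "rejected" : "accepted");
        return 0;
    }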

"Reject requests that have a body" might have been the one bit of applicable good advice that the paper has, back in October 1999. But then publicfile came along, in November, whose manual has from the start pointed out (https://cr.yp.to/publicfile/httpd.html) that publicfile httpd rejects requests that have content lengths or transfer encodings. It's a quarter of a century late to be handing out that advice as if it were a new security idea.

And the whole idea that this is a "niche webserver" is itself a bit suspect. I publish a consolidated djbwares that incorporates publicfile; but the world has quite a few other cut-down versions (dropping ftpd being a popular choice), homages that are "inspired by publicfile" but not written in C, and outright repackagings of the still-available original. It is perhaps not as niche as one might believe from looking at only a single variant.

I might be in the vanguard, in the publicfile universe, of making HTTP/0.9 and HTTP/1.0 unavailable in the default configuration, although there is a very quiet avalanche of the same happening elsewhere. I am certainly not persuaded by this paper, though, to consider that I need do anything at all about HTTP/1.1; it is based entirely upon a worldview, and publicfile is direct evidence that that worldview is not a universal truth. I have no back-end servers, no reverse proxies, no CGI, no PHP, no WordPress, no acceptance of requests with bodies, and no vulnerability to these "desync" problems that are purportedly the reason that I should switch over to the monoculture and then switch again because the monoculture "must die".