Comment by echoangle

7 months ago

So serving a document that exploits a browser zero-day for RCE, under a URL that's discoverable by crawling (because another page links to it), with the intent to harm the client (by deleting local files for example), would be legitimate because the client made a request? That's ridiculous.

> because another page links to it

That is not the case in this context. robots.txt is the only thing that specifies the document URL, and it does so in a "disallow" rule (sketched below). The argument that they did not know the request would be met with hostility could be moot in that context (possibly because a "reasonable person" would have chosen not to request the disallowed document, though I'm not really familiar with when that language applies).
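
For concreteness, a minimal robots.txt of the kind being described (the path here is hypothetical): the trap URL appears only in the disallow rule, so a compliant crawler never requests it, and only a crawler that parses the rule and deliberately ignores it ends up fetching the document.

    User-agent: *
    Disallow: /secret/pitfall.html    # hypothetical trap URL, linked from nowhere else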

> by deleting local files for example

This is a qualitatively different example from a zip bomb, as it is clearly destructive in a way that a zip bomb is not. It's true that a zip bomb could cause damage to a system, but that's not guaranteed, while deleting files is necessarily damaging. Worse outcomes from a zip bomb might result in damages worthy of a lawsuit, but the presumed intent (and ostensible result) of a zip bomb is to effectively cause the recipient machine to involuntarily shut down, which a court may or may not see as legitimate given the surrounding context.