
Comment by incrudible

4 years ago

That's not really testing the process, because now you have introduced bias. Once you know there's a bug in there, you can't just act as if you didn't know.

I guess you could receive "authorization" from a confidante who then delegates the work to unwitting reviewers, but then you could make the same "ethical" argument.

Again, from a hacker ethos perspective, none of this was unethical. From a "research ethics committee", maybe it was unethical, but that's not the standard I want applied to the Linux kernel.

> from a hacker ethos perspective, none of this was unethical.

It totally is if your goal as a hacker is generating a better outcome for security. Read the paper, see what they actually did, they just jerked themselves off over how they were better than the open source community, and generated a sum total of zero helpful recommendations.

So they subverted a process, introduced a use-after-free vulnerability, and didn't do jack shit to improve it.
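(For anyone unfamiliar: "use after free" means code that keeps dereferencing memory after it has been released, which can leak stale or attacker-controlled data. A minimal hypothetical sketch in C, not the actual code from the paper:)

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    struct session {
        char name[32];
    };

    int main(void) {
        struct session *s = malloc(sizeof(*s));
        if (!s)
            return 1;
        strcpy(s->name, "alice");

        free(s);  /* memory returned to the allocator */

        /* BUG: use after free -- s is now a dangling pointer;
         * this read may see stale or attacker-controlled data */
        printf("%s\n", s->name);
        return 0;
    }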

  • > It totally is if your goal as a hacker is generating a better outcome for security. Read the paper, see what they actually did, they just jerked themselves off over how they were better than the open source community, and generated a sum total of zero helpful recommendations.

    The beauty of it is that by "jerking themselves off", they are generating a better outcome for security. In spirit, the kernel team's reaction is not that different from Microsoft attempting to put asshole hacker kids behind bars for exposing them. When Microsoft realized that this didn't magically make Windows more secure, they fixed the actual problems. Windows security was a joke in the early 2000s; now it's arguably better than Linux's. Why? Because those asshole hacker kids actually changed the process.

    > So they subverted a process, introduced a use-after-free vulnerability, and didn't do jack shit to improve it.

    The value added here is to show that the process could be subverted; the lessons are to be learned by someone else.

    • > is to show that the process could be subverted; the lessons are to be learned by someone else.

      If you show up to a kernel developer's house, put a gun to their head and tell them to approve the PR, that process can also be subverted...


This is the sort of situation where the best you could do is likely to be slightly misleading about the purpose of the experiment. So you'd lead off with "we're interested in conducting a study on the effectiveness of the Linux code review process", and then submit a mix of patches: some with no issues, some with only Linux coding-style issues (things in the wrong place, etc.), some with only security issues, and some with both.

But at the end of the day, sometimes there's just no way to ethically do the experiment you want to do, and the right solution to that is to just live with being unable to do certain experiments.