
Comment by WrtCdEvrydy

4 years ago

I just want you to know that it is extremely unethical to create a paper in which you attempt to discredit others by using your university's reputation to introduce vulnerabilities on purpose.

I back your decision and fuck these people. I will additionally be sending a strongly worded email to this person, their advisor, and whoever's in charge of this joke of a computer science school. Sometimes I wish we had the ABA equivalent for computer science.

Please don't fulminate on HN (see https://news.ycombinator.com/item?id=26889743).

  • Are you serious? If I publish a paper on the social engineering vulnerabilities we have used over the last three months to gain access to your password and attempt to take over Hacker News, would you be fine with it? No outburst, no angrily banning my account...

    • It seems odd that you are responding with a threat, or at least a threatening hypothetical, to a (the?) moderator.

      The way I understand it is that unnecessarily angry or confrontational posts tend to lower the overall tone. They are cathartic/fun to write, fast to write, and tend to get wide overall agreement/votes. So if they are allowed then most of the discussion on a topic gets pushed down beneath that sort of post.

      Hence we are asked to refrain, to leave more room for focused and substantive discussion.

      5 replies →

Just write to irb@umn.edu and ask a) whether this was reviewed and b) who approved it. Either way, it seems they have violated the Human Research Protection Program Plan.

The researchers should not have done this, but ultimately it's the faculty that must be held accountable for allowing this to happen in the first place. They are a for-profit institution and should not get away with harassing people who are contributing their personal time. So nail them to the proverbial cross, but make sure the message is heard by those who slipped up (not the researchers, who should have been stopped before it happened).

I completely disagree with this framing.

A real malicious actor is going to be planted in some reputable institution, creating errors that look like honest mistakes.

How do you test whether the process catches such vulnerabilities? You do it just the way these researchers did.

Yes, it creates extra homework for some people with certain responsibilities, but that doesn't mean it's unethical. Don't shoot the messenger.

  • > A real malicious actor

    They introduced a real vulnerability into a codebase used by billions, lowering worldwide cybersecurity, so they could jerk themselves off over a research paper.

    They are a real malicious actor, and I hope they get hit by the CFAA.

    • There is a specific subsection of the CFAA that applies to this situation (deployment of unauthorized code that makes its way into non-consenting systems).

      This was a bold and unwise exercise, especially for a participating academic who is in the country on a revocable visa.

      2 replies →

  • No. There are processes for doing this sort of penetration testing. Randomly sending buggy commits or commits with security vulns to "test the process" is extremely unethical. The Linux kernel team are not lab rats.

    • It's not simply unethical; it's a national security risk. Is there proof that the Chinese government was not sponsoring this "research", for example?

      9 replies →

    • > There are processes to do such sorts of penetration testing.

      What's the process, then? I doubt there is such a process for the Linux kernel; otherwise the response would've been "you did not follow the process" instead of "we don't like what you did there".

      8 replies →

  • These are real malicious actors.

    • You don't know that, but that's also irrelevant. There's always plausible deniability with such bugs. The point is that you need to catch the errors no matter where they come from, because you can't trust anyone.

      3 replies →

  • It is unethical. You cannot experiment on people without their consent. Their own university has explicit rules against this.