Comment by brookst

7 days ago

Definitely a good way to drive talent overseas. Get the low-level people to assume all of the risk with none of the upside; ask recent grads and junior people to do E2E ethical analysis on every project in addition to their 60-hour/week job, and give the truly evil people convenient, lower-level scapegoats.

Completely agree.

My feeling is that corporate officers should bear the burden that the corporation, as a person, currently bears. I can only imagine how much better things would have been in my past experience if the C-levels had felt a personal need to actually know how the sausage is made.

  • I can't fully agree, because the way I see it, that is in a way scapegoating the company executives. Are they responsible? Probably, yes: they set the direction of the company and give the orders at the highest level. But we, the engineers and designers, are the ones actually turning what is probably a fairly nebulous order at the highest levels into something concrete. They decree that evil should be created, but we're the ones who actually make it happen.

    Some of the responsibility lies with us, and we need to not pretend that's not the case.

    • Do you also take personal responsibility for your company’s hiring practices, investment strategy, and marketing content? None of that would exist without you.

      I think anyone would agree that there’s a level of flagrancy where individuals should feel culpability and make the right choices (“write software to prescribe poison to groups we don’t like”).

      But something like this? Two apps establishing a comms channel? How many millions of times does this get done per year with no ill intent or effect? Is every engineer supposed to demand to know all of the use cases, and cross-reference them against other projects they’re not working on?

      At some point it’s only fair to say that individuals should exercise their conscience when they have enough information, but it is not incumbent on every engineer to demand justification for every project. That’s where the decision makers who do have the time, resources, and chatter to know better should be taking at least legal responsibility.

    • I'd agree that at a personal/moral level there is equal responsibility. However, that doesn't recognise the power and risk/reward imbalances at play here.

      If you, as an employee, did this, maybe you'd add a few dollars to your stock options over time. If you're Zuck, that's potentially billions.

      And in terms of downside: if you are Zuck and stop it inside the company, there is no comeback; if you are an engineer blowing the whistle, you may find it hard to ever work in the industry again. And only one of those two actually needs to work.

    • > I can't fully agree because the way I see it, that is in a way scapegoating the company executives.

      Frankly, that's what the money's for.

You don't need to invest significant time to realize that working around privacy restrictions is wrong and you shouldn't do it.

  • Have you worked in software? This is a complex, multi-application system with IPC. Most of the people implementing it probably had no idea what the partner applications were, let alone the business intent.

    Nobody sits down with a mid-level developer and says “we need your native app to receive webrtc connections that will be used to send app-layer telemetry that circumvents privacy protections”. The requirement is just to receive events and log them. And odds are there were all sorts of harmless events as well.

    At the level where people had a holistic view of the system and the intent, sure, throw them in jail. I'd guess that's about 1% of the people who designed, implemented, tested, and documented this code.
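To make the point above concrete, here is a hypothetical Python sketch of the kind of ticket a mid-level developer might actually see: accept a connection from another local app and log the JSON events it sends. All names, ports, and event shapes are invented for illustration; viewed in isolation, nothing about this task hints at circumventing anything.

```python
import json
import socket
import threading

def log_events(srv, received):
    """Accept one local connection and record each JSON event line it sends."""
    conn, _ = srv.accept()
    with conn, conn.makefile("r") as f:
        for line in f:
            received.append(json.loads(line))

# Listen on localhost; port 0 lets the OS pick any free port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]

received = []
t = threading.Thread(target=log_events, args=(srv, received))
t.start()

# A second local app connects and sends an event -- ordinary-looking IPC.
with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b'{"event": "page_view", "ts": 1}\n')

t.join()
srv.close()
print(received)  # → [{'event': 'page_view', 'ts': 1}]
```

Whether `page_view` events are harmless product analytics or part of a tracking pipeline depends entirely on what sits at the other end of the socket, which the person implementing this receiver typically never sees.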