Comment by bryan_w

13 hours ago

I used to work for an ad tech company (which I know already makes me the devil to some around here), and even I think that they crossed a line with this. A lot of industry terms are coded in corporate speak to make them sound better (think "revealed preferences" or "enabling personalization"), but I would genuinely like to know what the engineers thought when doing design reviews for a "selective stand down" feature. There doesn't seem to be a legit way to spin it.

Making a product to explicitly skirt agreements while working for a corporation is ... a choice

> what the engineers thought when doing design reviews for a "selective stand down" feature.

Possibly a version of, “I lack the freedom to operate with a moral code at work because I’m probably replaceable, the job market makes me anxious, my family’s well-being and healthcare are tied to having a job, and I don’t believe the government has my back.”

  • From my experience, it’s more likely that the engineers who got far enough in the company to be working on this code believed that their willingness to work on nefarious tasks that others might refuse or whistle-blow made them a trusted asset within the company.

    In industries like this there’s also a mindset of “Who cares, it’s all going to corporations anyway, why not send some of that money to the corporation that writes my paychecks?”

    • I have noticed that in addition to this perspective there are scores of developers who espouse the idea that “we just create, what people do with our work isn’t our business.”

      I understand the utilitarian qualities of the argument, but I submit that there’s a reason that capital-E-Engineering credentials typically require some kind of education in ethics-in-design.

    • I suspect you are right. It reminds me of the whole "at the government you can hack legally" argument used by government intelligence agencies to recruit hackers.

      I think a lot of skilled engineers want interesting challenges where they break boundaries, and being in an environment that wants you to break those boundaries allows them to legitimize why they are doing it. That is, "someone else is taking moral responsibility, so I can do my technical challenge in peace"

• Do you know of anyone who declined to work on a project for reasons they considered ethical (outside of military or lethal applications)?

  I’ve led a sheltered life and never met one. People have told me they wouldn’t apply for a role at a company for ethical reasons, though who knows whether they even would have gotten the job.


  • I like the idea that what makes someone a 'professional' instead of just an employee is the wherewithal, agency, and expectation to say no to a particular task or assignment.

    An architect or engineer is expected to signal and object to an unsafe design, and is expected by their profession (peers, clients, future employers) to refuse said work even if it costs them their job. This applies even to professions without a formalized license board.

    If you don't have the guts and ability to act ethically (and your field will let you get away with it), you're just a code monkey and not a professional software developer.

• Maybe when the government and the shareholders start setting an example, hold the bosses and capital owners accountable, reward instead of punish whistleblowers, and there are enough jobs that losing yours isn't catastrophic, moral behavior further down the hierarchy will improve.

• Those poor guards working in the concentration camps in Nazi Germany just wanted job security. They can’t be blamed for their actions.

• In my experience, sometimes your employer blatantly lies to you about what you're making and how it'll be used. I was once recruited to work on a software installer that could build and sign dynamic collections of software, meant to conveniently install several packages at once. Like, here's a set of handy tools for X task, here are the default apps we install on machines for QA people, here is our suite of apps for whatever. It seemed to have genuine utility because it could pull data in real time to ensure everything was patched and current, which could be great for getting new machines up and running quickly. Several options exist for this use case today, but as far as I recall none did then. This was on Windows.

Ultimately it was only used to install malware in the form of browser extensions, typically disguised as an installer for some useful piece of software like Adobe Acrobat. It would guide you through installing some 500-year-old version of Acrobat and sneakily unload the rest of the garbage, for which we would be paid, I don't know, 25 cents to a couple dollars per install. Sneaking Chrome onto people's machines was great money for a while. At one point we were running numbers of around $150k CAD per day just dumping trash onto unsuspecting people's computers.

    At no point in the development of that technology were we told it was going to ruin countless thousands of people's browsers or internet experiences in general. For quite a while the CEO played a game with me where I'd find bad actors on the network and report them to him. He'd thank me and assure me they were on top of figuring out who was behind it. Eventually I figured out that the accounts were in fact his. They let me go shortly after that with generous severance.

    I don't miss anything about ad tech. It was such a disheartening introduction to the software world. It's really the armpit and asshole of tech, all at once.

    • > Ultimately it was only used to install malware in the form of browser extensions, ...

      Like any other MDM software.[0] Everyone who has been long enough in the infosec industry knows that MDM is fundamentally nothing more than a corporate-blessed malware and spyware package.

In the past 2-3 years the criminal gangs have realised that too. The modern form of socially engineered phishing quite often entices victims to install a legit MDM software package (e.g. Microsoft Intune) and hand over device control for remote management. Why bother writing malware that has to fiddle with syscall hooks and screenshot capabilities when you have a vendor-approved way of doing the same thing?

      0: https://en.wikipedia.org/wiki/Mobile_device_management

  • I think you can only get away with that excuse so long as you're actively looking for a new job while also collecting data to turn whistleblower (anonymously if need be) once you have one. Ultimately it falls on the employee to do the right thing or get out because they risk being held accountable for what they do. A replaceable employee (which is pretty much all of them) will be especially vulnerable since they can be thrown under the bus with minimal inconvenience to the company.

  • Ah yes let's be sure not to judge anyone for anything they do

    • You can still judge them evil even if the parent was accurate as to the motivations for their actions. Villains are more interesting when they're sympathetic.

You're in the planning meeting discussing this feature, you ask "Hey, are we allowed to do this? I thought stand downs were contractual." and your PM says yes, they got the okay from legal. Now what do you do?


A nice set of examples can be found in Guido Palazzo's The Dark Pattern.

“The Dark Pattern by Guido Palazzo and Ulrich Hoffrage teaches us about the power of context, which is stronger than reason, values, morals, and best intentions. It is an uncomfortable and painful lesson about the root causes of ‘corporate infernos.’”

The context matters.

Think of the banality of evil in WW2 Germany.

We are capable of doing almost anything, good or bad, as long as the crowd around us does it and pretends it's normal.

Ethically bankrupt software engineer startled that others aren’t holding the line of civilisation for them.

This is no different from, and frankly far less alarming to me than, Uber's Project Greyball from 2017, which should have tanked the company in a just world. I suppose some companies just promulgate a culture where it's acceptable, or even lauded, to evade law and contracts: https://www.nytimes.com/2017/03/03/technology/uber-greyball-...

  • You are right, but it's just a whataboutism argument, isn't it? There are lots of other evils by other businesses; why are they relevant here?

• This comment was replying to someone asking "how could engineers possibly write such malicious code," so a more glaring example from a more mainstream company seemed quite appropriate.

> but I would genuinely like to know what the engineers thought when doing design reviews for a "selective stand down" feature.

First comes a full stomach, then comes ethics.

> I used to work for an ad tech company (which I know already makes me the devil to some around here)

Yes, thank you for making the web objectively worse for everyone. You should feel bad.

Possibly "marketing is all bullshit and hopefully this destroys it faster"

It's not like any crime was committed, and civil liability falls squarely on the business here, not its employees. And the whole dispute is only about which marketing company receives marketing revenue - something where the world would improve if they all disappeared overnight. Doesn't really seem that evil to me. Underhanded, yes.

I think the only reason there's any outrage at all, outside the affiliate marketing "industry", is that some of these marketing companies are YouTube personalities with whom many people have parasocial relationships. Guess what, they just got to learn the hard way why capitalism sucks. What Honey did is a valid move in the game of business. Businesses throughout history have gained success by doing way worse things than this. Amazon's MFN clause is way worse. Uber's Greyball is way worse.