Comment by colechristensen

3 months ago

It's DARPA, you're really past the moralizing about war stage here, that's just out of context. I don't see UX experts hand-wringing about the effects of advertising when they're designing their products.

>discourse around wellness, anxiety, and mental health in such a banal manner

It's not about "feelings", and that might disturb you, but a great many things really should be much less about feelings. A whole lot of "wellness, anxiety, and mental health" isn't about feelings but about being inside or outside the limits of what a person is capable of handling. Fact-based analysis of work, life, and people being pushed too far outside their comfort zones could do a lot for many people dealing with mental health issues.

DARPA does and obviously _needs to_ study these things. One of the most important areas for this is pilots, especially during emergencies. It comes from both directions: designing the machine to be manageable, training the human to manage in exceptional circumstances, and _knowing the limits_ of both.

> It's DARPA, you're really past the moralizing about war stage here, that's just out of context.

I don’t really think I was moralizing… just commenting on the funny juxtaposition of the language and the context, or on the comedy of the language specifically when not considering the whole context. I was not saying DARPA should or should not be doing this, though I’ll grant that what I wrote could be read as an implicit criticism, even though that was not my intention.

> I don't see UX experts hand-wringing about the effects of advertising when they're designing their products.

Plenty do. Plenty don’t. Similarly, plenty of machine learning engineers might choose not to work on, say, a predictive algorithm for facial recognition or a product recommender system because they don’t feel like being a part of that system. Some people don’t have that luxury, or don’t care. It’s fine either way, though I of course encourage everyone to reflect on the social implications of their engineering projects from time to time. Hamming, who worked on everything from the A-bomb to telephones to the foundations of computer programming (and everything in between), strongly recommends this, and I agree. Working on weapons might be necessary, but you still need to reflect and make a conscious decision about it.

> It's not about "feelings" […] It comes from both directions, designing the machine to be manageable and training the human to manage in exceptional circumstances and _knowing the limits_ of both.

Of course, I totally understand that. That doesn’t mean we can’t find humor in decontextualizing the language! Or in thinking about how science must always struggle with euphemism for the sake of concision.