> Like what? There's no technology that simply by existing causes harm to the world, people do that part.
People create that technology, thereby imposing their own lack of morals and ethics onto it. That's the part most humans in the post-digital age seem to ignore, purposefully deflecting and absolving themselves of any responsibility.
Also, companies will always be controlled by humans who have optimized their lives for greed, not by those who have specialized in philosophical implications.
The inventors of Novichok or the nuclear bomb didn't have "world peace" in mind. They had "world peace through me enforcing my own will onto my enemies" in mind.
Like that chemical weapon that was specifically designed to react with gas-mask absorbent materials, activating on the protected side to circumvent filtration (long banned, since the end of WWI).
I hate to admit it, but it's true. Technology is amoral and neutral rather than morally directed; it can be directed towards profits, control, and nefarious goals, sure. It's the added externalities that technological advancement enables: jobs lost and suffering borne by many, power gained by a few. The decision whether and how to use technology is where the moral crossroads lies, and it is either considered or ignored by the stakeholders involved. Substantive engineering ethics isn't much of a thing anymore as long as the TC is high enough; performative complaints about napkins not being 100% recycled or too few trees planted are the ostensible substitutes.
That is simply not true. There is lots of bad technology.
Who gets to decide which technology must be banned? The same people who decide which books must be burned?
Surely that would be you.
Like what? There's no technology that simply by existing causes harm to the world, people do that part.
> There is no such thing as bad technology.
If nothing else, it's a debate where we'd need to define our terms.