Comment by dleeftink
1 day ago
Not saying this would be the right way to go about preventing undesirable uses, but shouldn't building 'risky' technologies signal some risk to the ones developing them? Safe harbor clauses have long allowed the risks to be externalised onto the user, fostering non-responsibility on the developers' part.
Foisting the responsibility of the extremely risky transport industry onto the road developers would certainly prevent all undesirable uses of those carriageways. Once they are at last responsible for the risky uses of their technology, like bank robberies and car crashes, the incentive to build these dangerous freeways evaporates.
I think this is meant to show that moving the responsibility this way would be absurd because we don't do it for cars but... yeah, we probably should've done that for cars? Maybe then we'd have safe roads that don't encourage reckless driving.
But I think you're missing their "like bank robberies" point. Punishing the avenue of transport for illegal activity that's unrelated to the transport itself is problematic, i.e. people who are driving safely but using the roads to carry out bad non-driving-related activities.
It's a stretched metaphor at this point, but I hope that makes sense (:
We wouldn't have roads at all is my point, because no contractor in their right mind would take on unbounded risk for limited gain.
> then we'd have safe roads that don't encourage reckless driving.
You mean like speed limits, drivers licenses, seat belts, vehicle fitness and specific police for the roads?
I still can't see a legitimate use for anyone cloning anyone else's voice. Yes, satire and fun, but also a bunch of malicious uses as well. The same goes for non-fingerprinted video generation. It's already having a corrosive effect on public trust. Great memes, don't get me wrong, but I'm not sure that's worth it.
And I am talking about user-facing app development specifically, which has a different risk profile compared to automotive or civil engineering.
Well it would also apply to bike lanes.
How can you know how people are going to use the stuff you make? This is how we end up in a world where a precondition to writing code is having lawyers on staff.
No.
The reason safe harbor clauses externalize risks onto the user is because the user gets the most use (heh) of the software.
No developer is going to accept unbounded risk based on user behavior for a limited reward, especially not if they're working for free.
The reason safe harbor clauses exist is because you don't blame the car manufacturer for making the bank robbery get away car.
Just last weekend I developed a faster Reed–Solomon encoder. I'm looking forward to my jail time when somebody uses it to cheaply and reliably persist bootlegged Disney assets, just because I had the gall to optimize some GF(256) math.
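(For anyone wondering what "GF256 math" means here: Reed–Solomon codes do their arithmetic in the finite field GF(2^8), where multiplication is carryless and the result is reduced by a primitive polynomial. A minimal sketch of that multiply is below; 0x11D is a polynomial commonly used in Reed–Solomon implementations, and the function name is mine, not from any particular library.)

```python
def gf256_mul(a: int, b: int, poly: int = 0x11D) -> int:
    """Multiply two elements of GF(2^8).

    Carryless (XOR-based) shift-and-add, reduced modulo the given
    primitive polynomial whenever the intermediate overflows 8 bits.
    """
    result = 0
    while b:
        if b & 1:          # "add" (XOR) the current shifted a
            result ^= a
        a <<= 1
        if a & 0x100:      # overflowed 8 bits: reduce mod poly
            a ^= poly
        b >>= 1
    return result

# x * x = x^2 in GF(2^8), i.e. 2 * 2 = 4
print(gf256_mul(2, 2))  # → 4
```

(A fast encoder would replace this bit-loop with log/antilog lookup tables, which is presumably where the optimization the commenter mentions comes in.)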
That is not what I said. It is about signalling risks to developers, not criminalising them. And in terms of encoders, I would say it relates more to digital 'form' than 'content' anyway: the container of a creative work vs. the 'creative' (created) work itself.
While both can be misused, to me the latter category seems to afford a far larger set of tertiary/unintended uses.
No