Comment by billmalarky
12 hours ago
Yes -- definitely that's the value prop. But it's not binary all or nothing.
AI automation is about trust (honestly, same as human delegation).
You give it access to a little bit of data, just enough to do a basic useful thing or two, then you give it a bit of responsibility.
Then as you build confidence and trust, you give it a little more access, and allow it to take on a little more responsibility. Naturally, if it blows up in your face, you dial back access and responsibility quick.
As an analogy, folks drive their cars on the highway at 65-85+ MPH. The fatality rate climbs steeply with speed -- crashes at 60+ mph are considerably more deadly than at ~30 mph.
We're all so confident that a wheel won't randomly fall off because we've built so much trust in the quality of modern automobiles. But it does happen (I had a friend in high school whose wheel popped off on a 45 mph road -- naturally he was going 50-55 IIRC).
In the early 1900s, people would have thought you had a death wish to drive that fast. 25-30 mph was normal then -- the automobiles of the time just weren't developed enough to be trusted at higher speeds.
My previous comment was about the fact that it is possible to build this sandboxing/bastion layer with live web accounts, allowing fine-grained control over how much data you want to expose to the AI.