Comment by lucianbr
3 days ago
That's how AI is supposed to be used, no? That's what the providers advertise - it increases development speed, a lot, it replaces devs and so on.
But I guess it's only ok when you work on regular joe facing projects, where the consequences of bugs are on powerless users. If the consequences are on Google, well, that's not acceptable now is it?
The consequence for Google is that people were misusing the keys, and Google is fixing that. They're not banning anybody who uses proper API keys.
> using AI for vibes is a fast track to bugs and security incidents
Yes, that's what he said.
No human is being punished; the robot's access to the API is being restricted. The human has suffered no damage.
You’re responsible for the things your AI agent does.
> That's how AI is supposed to be used, no? That's what the providers advertise - it increases development speed, a lot, it replaces devs and so on.
Not really. There’s a difference between accelerating development in the hands of an experienced developer and having somebody just slop out code while hoping for the best.
Adopting AI doesn’t mean removing code review. Those were two separate choices that happened to be combined.
> https://blog.samaltman.com/the-gentle-singularity
Search for "review": 0 matches.
Of course the fine print says to review, just like ultimate control of "full self driving" rests with the human driver. But why is it fine print rather than as large as the large print? Maybe because you're not supposed to pay attention to it? Could that be it?