Comment by ImPostingOnHN

2 days ago

> AI should not have done that operation when they have explicit rules not to.

How much experience do you have with LLMs?

One of the first lessons developers learn after working with LLMs for a bit is that the LLM will hallucinate, and you need to be alert and competent enough to recognize when it happens. It's like a car with steering assist: you still have to pay attention and take personal responsibility for whatever happens.

As a consequence, one of the next lessons developers learn is that there is no such thing as "an explicit rule" for an LLM. "Explicit rules" can still be ignored under many different circumstances. The sooner a developer learns this, the sooner they can be productive with LLMs, and the less likely they are to delete their own production database and then blame tools they're unfamiliar with.
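The practical upshot: a rule the model is merely *told* isn't a rule at all; if you actually care about it, enforce it outside the model. A minimal sketch of that idea, with invented names (`guarded_execute` and the deny-list are illustrative, not any real agent framework's API):

```python
import re

# Statements we refuse to run no matter what the LLM says or was told.
# Enforced in code, so no prompt, jailbreak, or hallucination can bypass it.
DESTRUCTIVE = re.compile(r"^\s*(DROP|DELETE|TRUNCATE|ALTER)\b", re.IGNORECASE)

def guarded_execute(sql: str, execute):
    """Run an LLM-proposed SQL statement only if it isn't destructive.

    `execute` is whatever actually talks to the database; the guard sits
    in front of it so destructive statements never reach it.
    """
    if DESTRUCTIVE.match(sql):
        raise PermissionError(f"blocked destructive statement: {sql.split()[0]}")
    return execute(sql)
```

Even if the model "agrees" to never drop a table and then emits `DROP TABLE users;` anyway, the guard raises instead of executing. (In practice you'd also scope the database credentials themselves so the account simply lacks DDL/DELETE privileges; the sketch just shows the principle.)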