Comment by nyrikki
2 days ago
It is really the only safe way to use it IMHO.
Even in the simplest forms of automation, humans suffer from Automation Bias and Complacency, and one of the better ways to avoid those issues is to instill a fundamental mistrust of those systems.
IMHO it is important to look at other fields and their human-factors studies to understand this.
As an example, ABS was originally sold as a technology that would help you 'stop faster', which it may do in some situations, and it is now mandatory in the US. But they had to shift how they 'sell' it to ensure that people didn't over-rely on it.
https://www.fmcsa.dot.gov/sites/fmcsa.dot.gov/files/docs/200...
2.18 – Antilock Braking Systems (ABS)
ABS is a computerized system that keeps your wheels from locking up during hard brake applications.
ABS is an addition to your normal brakes. It does not decrease or increase your normal braking capability. ABS only activates when wheels are about to lock up.
ABS does not necessarily shorten your stopping distance, but it does help you keep the vehicle under control during hard braking.
Transformers will inevitably produce code that doesn't work; it doesn't matter whether that is due to what people call hallucinations, Rice's theorem, etc.
Maintaining that mistrust is the mark of someone who understands and can leverage the technology. It is just yet another context-specific tradeoff analysis that we will need to assess.
I think forcing people into a quasi-TDD thinking model, where they focus on what needs to be done before jumping into implementation details, will probably be a positive thing for the industry, no matter where on the spectrum LLM coding assistants land.
That is one of the hardest things to teach when introducing TDD: starting from tests that are far closer to an ADT than to implementation-specific unit tests is very different, but very useful.
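To illustrate the distinction, here is a minimal sketch (the `Stack` class and test names are hypothetical, invented for this example): the first test states the ADT's contract purely through its public interface, while the second couples itself to a private storage detail and breaks under any refactor, even one that preserves behavior.

```python
# Hypothetical minimal Stack used to contrast two testing styles.
class Stack:
    def __init__(self):
        self._items = []  # internal detail; the two tests differ in whether they touch it

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def is_empty(self):
        return not self._items


# ADT-style test: asserts the contract (LIFO order) using only the public interface.
# It survives any reimplementation (linked list, deque, ...) that honors the contract.
def test_pop_returns_items_in_reverse_push_order():
    s = Stack()
    s.push(1)
    s.push(2)
    assert s.pop() == 2
    assert s.pop() == 1
    assert s.is_empty()


# Implementation-specific test: reaches into the private list, so it fails if the
# storage strategy changes even though observable behavior does not.
def test_push_appends_to_internal_list():
    s = Stack()
    s.push(1)
    assert s._items == [1]


test_pop_returns_items_in_reverse_push_order()
test_push_appends_to_internal_list()
```

The first style is the one that transfers to reviewing LLM output: you specify what must hold before looking at how the generated code achieves it.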
I am hopeful that the tacit experience this requires will help get past the barriers that formal frameworks run into when teaching that one skill.
As LLMs' failure mode is Always Confident, Often Competent, and Inevitably Wrong, it is critical to always remember that the third option is likely and that you are the expert.