Comment by littlestymaar
6 days ago
> scaffolding code, trivial functions, ... are things that LLMs excel at doing and once you get used to offload those to the LLM it is really hard to get back to doing it manually.
True, but local models cover that use-case very well already and consume little power doing so.
That is a fair point. Honestly, for programmers and other technical people I believe you are right: it is already trivial to get started with local models. For non-technical people, though, it is just easier to open the Claude/Gemini/OpenAI web interfaces and chat away, especially since those come with great integrations (say, Google Drive) out of the box.