Comment by tosmatos

19 days ago

I agree with this.

I've used AI as a crutch for a time, and felt my skills get worse. Now I've set it up to never have it give me entire solutions, just examples and tips on how to get it done.

I've struggled with shader programming for a while, tried to learn it from different sources, and failed a lot. It felt like something unreachable for me, though I don't really know why. But with the help of an AI that's fine-tuned for mentoring, I really understood some of the concepts. It outlined what I should do and asked Socratic questions that made me think. I've gotten way better at it and actually have a pretty solid understanding of the concepts now (well, I think).

But sometimes at work I do give in and get it to write an entire script for me, out of laziness and maybe boredom. The significant advances as of late with "extended thinking" and the like have made these models much more likely to one-shot a slightly complex script... Which in turn makes it harder not to just say, "hey, that sounds like boring work, let's have the AI do the biggest part of it and I'll patch up the rest."

I have a similar setup going on. I'm a heavy user of LLMs, but the only time I use the code they generate is for throwaway scripts. I like to describe the problem I'm working on, paste in my code, and ask what's wrong with it. Am I missing something? Are there glaring flaws or inefficiencies? Are there better ways to approach this? I never take a suggestion unless I fully understand and agree with it. There are lots of poor suggestions, but lots of really good ones too.

Infinite tailored critique and advice. I have found this immensely valuable, and I have learned lots doing it. LLMs are static analyzers on steroids.