Comment by vendiddy

19 days ago

I think AI will have a dual effect. It will make some folks smarter and others dumber.

For example, you could have ChatGPT write your code for you, then explain it to you step by step.

It can be an interactive conversation.

Or you could copy/paste it.

In one case it acts as a tutor.

In another case it just does your work for you.

I agree with this.

For a while I used AI as a crutch and felt my skills get worse. Now I've set it up to never give me entire solutions, just examples and tips on how to get things done.

I struggled with shader programming for a while; I tried to learn it from different sources and failed a lot. It felt like something unreachable for me, and I don't really know why. But with the help of an AI that's fine-tuned for mentoring, I finally understood some of the concepts. It outlined what I should do and asked Socratic questions that made me think. I've gotten way better at it and now have a pretty solid understanding of the concepts (well, I think).

But sometimes at work I do give in and get it to write an entire script for me, out of laziness and maybe boredom. Recent advances like "extended thinking" have made these models much more likely to one-shot a slightly complex script... which in turn makes it harder not to just say "hey, that sounds like boring work, let's have the AI do the biggest part of it and I'll patch up the rest".

  • I have a similar setup going on. I'm a heavy user of LLMs, but the only time I use the code they generate is for throwaway scripts. I like to describe the problem I'm working on, paste in my code, and ask about everything wrong with it. Am I missing something? Are there glaring flaws or inefficiencies? Are there better ways to approach this? I never take suggestions unless I fully understand and agree with them. There are lots of poor suggestions, but lots of really good ones too.

    Infinite tailored critique and advice. I have found this immensely valuable, and I have learned lots doing it. LLMs are static analyzers on steroids.
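    Concretely, my "ask about everything wrong with it" setup is basically a template like this (a minimal sketch; the wording, function name, and example problem are just my own illustration, not any particular tool):

    ```python
    def build_review_prompt(problem: str, code: str) -> str:
        """Assemble a code-review prompt in the style described above:
        state the problem, paste the code, then ask for flaws,
        inefficiencies, and alternative approaches."""
        return (
            f"I'm working on this problem: {problem}\n\n"
            f"Here is my code:\n```\n{code}\n```\n\n"
            "What's wrong with it? Am I missing something? "
            "Are there glaring flaws or inefficiencies? "
            "Are there better ways to approach this?"
        )

    # Example: asking for critique of a (hypothetical) dedup script.
    prompt = build_review_prompt(
        "deduplicating lines in a large log file",
        "seen = set()\nfor line in open('log.txt'):\n    ...",
    )
    print(prompt)
    ```

    Then I paste the model's answer back into the conversation and push on anything I don't fully understand or agree with.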

It is one thing to get code explained to you (which can also be good) and another to engage in finding a solution: exploring the problem space, failing a couple of times, learning from your mistakes, and going through the embodied process of writing the code itself. Learning is an active process; having things explained to you is not bad, but it does not lead to the same depth of understanding. Granted, not all subjects and cases benefit equally from deeper understanding, and it is impossible to go into depth on everything. So in each case there is a trade-off in how deep one wants to go, and it is great that we now also have the option not to. But IMO one should be mindful about it and make conscious decisions about how to use LLMs in cases where understanding the subject matters.

There are still ways LLMs can be used in that case: having them review your code, or suggest alternatives (say, more idiomatic ways to do something when you're delving into something new), while treating their output critically, of course. But actually writing one's own code is important for some kinds of understanding.

> In one case it acts as a tutor

This can be very useful when you are learning programming.

You don't always have a tutor available and you shouldn't only rely on tutors.

It might be useful when you start learning a new programming language/framework, but you should also learn how to articulate a problem and search for solutions, e.g. going through Stack Overflow posts and identifying whether a post applies to and solves your problem.

After a while (took way too long for me) you realize that the best way to solve problems is by looking up the documentation/manpage of a project/programming language/whatever and really try to understand the problem at its core.

I wonder how much even this approach would help. I would liken it to studying past exam papers with the solutions on hand. My experience is that you have to solve the problems yourself to properly absorb the concepts, rather than just copying them into your short-term memory for a while.

AI will make experts more effective and weed out most of the people who would otherwise grow into experts.

Basically most people will be idiots, except for the mental-exercise types who like using their mental muscles.

So education will stop being a way to move up in life.

I agree - the truly curious will be rewarded while those who couldn’t care less will mindlessly copy and paste. Maybe that will give the rest of us job security?

It's just Google (web search) v2: if you can input the right terms and interpret the results critically, you'll be accelerated. If not, you're just another mark.

  • Also there's no context or docs to dig into; it just spits something out that looks right but might be relying on deprecated code or be completely wrong.

    Ask it to explain something? At least it's confident I guess.