Comment by s1mplicissimus

1 day ago

> Ok then why does nearly every company make people write code for interviews or do take home programming projects?

For the same reason they use leetcode problems to "test" an applicant's skill, or have them write mergesort on a chalkboard by hand. It gives them a warm fuzzy feeling in the tummy because now they can say "we did something to check they are competent". Why, you ask? Well, it's mostly impossible to come up with a test to verify a competency you don't have yourself. Imagine you can't distinguish red and green, are not aware of it, but want to hire people who can. That's their situation, but they cannot admit it, because it would be clear evidence that they are not a good fit for their current role. Use this information responsibly ;)

> Why do people list programming languages on their resumes if it's "least important"?

You put the programming languages in there alongside the HR-soothing stuff because you hope that an actual software person gets to see your resume and gives you an extra vote for being a good match. Notice that most guides recommend a relatively small amount of technical content versus lots of "using my awesomeness I managed to blafoo the dingleberries in a more efficient manner to earn the company a higher bottom line".

If you don't want to be a software developer, that's fine. But your questions point me towards the conclusion that you don't know much about software development in the first place, which doesn't speak well for your ability to estimate how easy it will be to automate it using LLMs.

Arguing about programming is not the point, in my opinion.

When AI becomes able to do most non-programming tasks too, say design or solving open-ended problems (which, except in trivial cases, it cannot, for now), we can have this conversation again...

I think saying "well, programming is not important, what matters is $THING" is a coping mechanism. Eventually AI will do $THING well enough for the bean counters to push for more layoffs.

  • When AI can do the software engineering tasks that require expertise outside of coding, like system design, scoping problems, cross-team/domain work, etc., then it will be AGI, at which point the fact that SWE jobs are automated would be the least of everyone's worries.

    The main problem I perceive with AI being able to do that kind of work is that it requires an unprecedented level of agency and context-gathering. Right now agents are very much like juniors in that they work in an insular, not collaborative, way.

    Another big problem is that these higher level problems often require piecing together a lot of fragmented context. If the AI already had access to the information, sure, it would probably be able to achieve the task. But the hard bit is finding the information. Some logs here, some code there, a conversation with someone on a different team, etc. It's often a highly intuitive and tacit process, not easily explicitly defined. There's a reason that defining what a "Senior" is tends to be very difficult.

    • > When AI can do the software engineering tasks that require expertise outside of coding like system design, scoping problems, cross-team/domain work, etc then it will be AGI

      I think you're talking about the really general case, but in my opinion that's not as important. It only takes AI solutions managing (in the near future) to cover the average case -- where most engineers actually work -- in a mediocre but cost-effective manner for this to have huge repercussions on the job market and salaries.

      > But the hard bit is finding the information. Some logs here, some code there, a conversation with someone on a different team, etc.

      I've no problem believing they will become more and more successful at this. This is information retrieval, which machines can do faster, and making sense of it all together is where advances in AI will need to happen. I think there's a high chance those advances will happen eventually, at least enough to cobble together projects that make the leadership happy (maybe after some review/adjustment by a few human experts they retain?). They do not even have to be particularly successful -- how many human-staffed engineering projects succeed, anyway?

  • Also, because the economy is no longer based on competition but is controlled by a bunch of industry-specific oligopolies, even if the bean counters are wrong it won't matter, because every other company will be similarly inefficient. Everybody loses, but the people in charge are too dumb to know. Our free market is currently broken.