
Comment by bigstrat2003

11 hours ago

> A lot of software has been squandering the massive hardware gains that have been made. I hope this changes when it becomes a lot harder to throw hardware at the problem.

Considering how many people are so averse to programming that they use LLMs to generate code for them? Not very likely, IMO. I would like to see it happen, but people seem allergic to actually trying to be good at the craft these days.

I think we aren't far from AI being able to solve this sort of problem too.

Imagine you are Apple and can just set an LLM loose on the codebase for a weekend with the task to reduce RAM usage of every component by 50%...

  • From everything I’ve seen, LLMs aren’t exactly known for writing extremely optimized code.

    Also, what happens to the stability and security of my phone after they let an LLM loose on the entire code base for a weekend?

    There are 1.5 billion iPhones out there. It’s not a place to play fast and loose with bleeding edge tech known for hallucinations and poor architecture.

    • > LLMs aren’t exactly known for writing extremely optimized code.

They are trained on everything, and as a result they write code like the average internet developer.

    • If you ask an LLM to code whatever, it definitely won’t produce optimized code.

      If you instead direct it at a specific task, finding memory and CPU optimization points based on perf metrics, then it's a completely different world.

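      For what it's worth, the "based on perf metrics" part can be as simple as capturing a profile and pasting the hot spots into the prompt, so the model works from measurements instead of guessing. A minimal Python sketch (the `build_report` hotspot is a made-up example):

      ```python
      import cProfile
      import io
      import pstats

      # A toy hotspot: building a string by repeated concatenation.
      def build_report(n):
          s = ""
          for i in range(n):
              s += f"row {i}\n"
          return s

      profiler = cProfile.Profile()
      profiler.enable()
      build_report(5000)
      profiler.disable()

      # Dump the top functions by cumulative time -- this is the kind of
      # measured evidence you'd paste into an optimization prompt.
      buf = io.StringIO()
      stats = pstats.Stats(profiler, stream=buf)
      stats.sort_stats("cumulative").print_stats(5)
      print(buf.getvalue())
      ```

      The same idea applies with `perf` or Instruments for native code: collect the profile first, then ask for targeted rewrites of the functions that actually dominate.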

    • They kind of do if you prompt them. As a POC, I had mine reimplement the Windows calculator (almost fully feature complete) in Rust, running in 2 MB of RAM instead of the 40 MB or whatever the Windows 11 version uses.

      A handwritten C implementation would most likely be better, but there is so much to gain from just slaughtering the abstraction bloat that it doesn't really matter.