Comment by vaxman

10 months ago

Linux on new Macs is pretty toxic except as an intellectual or academic exercise. With the industry transition to AI-as-a-platform, the hardware these Asahi devs are working on (M1..M5) has no real preservation value (even with 128GB RAM -- look at the crazy bad TOPS and the existing macOS VM capabilities).

Look at Project DIGITS, a game changer for Apple and Microsoft: that system boots Linux and can then run any number of knock-offs of the Mac/Windows-style GUI (which is all obsolete now -- all that's needed is a simple web UI and an AI speech/video interface). nVidia/MediaTek didn't have to spend any time or money (other than BSP/drivers and DevOps) to get the system up and running. In olden days, that could have caused a hardware competitor to carefully consider the future of maintaining a legacy, proprietary OS, but here in the future, Apple has trillions of dollars and still too much pride (until its execs retire Real Soon Now) to allow a competitive alternative to macOS to rise up (maybe it will live on as a sort of Mac-esque Dex®-type environment for iPhone/iPad/XR someday).

I'd like to see the third-party engineering effort go instead toward supporting nVidia and CUDA on the new Macs via Thunderbolt.

I don't see how AI has anything to do with it. I have been running Asahi Linux for more than a year with fewer issues than I had running a Dell XPS before that. I don't think calling it an intellectual exercise is fair here. These MacBooks are better Linux devices than a lot of laptops, even with some hardware support still missing.

  • Huh. Well, maybe look at the forest for the trees. You say 'Linux is great on these laptops, blah blah blah' and I can't disagree, but my point was that unlike those that came before, these particular laptops are dead meat in under 48 months, unworthy of the monumental effort to port and maintain Linux (or anything else) on them. Why spend valuable heartbeats porting and maintaining LINUX on a class of devices that are DEADER THAN A DOORNAIL? (Why not write a better 6502 emulator instead? /s)

    That is to say, all GOOD data analysts, accountants, software engineers, lawyers, musicians, writers, editors, animators, illustrators, systems people, etc. will accomplish their jobs using multi-modal LLM-like interfaces with supporting application-specific I/O devices, or they will be blown away by those who do. They don't need/want Mac/Windows/Linux to accomplish this -- it will start out as an app they click on in those systems, and eventually they will be able to buy machines that don't even have that old user experience anymore. Even the people creating the "agentic applications" for these users will themselves use agents to produce the most efficient systems possible, so they don't need Mac/Windows/Linux either (at least not in their currently recognizable forms).

    It will all become clearer as Project DIGITS (and Apple's response to it) gets into users' hands in a few weeks or months, along with the avalanche of agents that are coming (from Salesforce's pivot on down to productivity apps). Heck, there are already YouTube videos on how to create 3D models in Blender using AI. Ultimately, apps like Blender will turn themselves into agents, so it will be "help me create a 3D world for use in other agents that has features a, b, c..." and the governing AI will offer some premium agents for doing that, one of which will be functionality that was once found in an ancient desktop app called Blender.

    Again, personally I'd like to see the 31337 SWE skills bringing LINUX up on these obsolete devices go instead into bringing up CUDA on external Thunderbolt-connected nVidia GPUs, to squeeze a bit more life out of these lame-duck machines during the transition.

    • Why are they dead meat? It makes no sense to me. They have NPUs that have been reverse-engineered for use in Linux, and they basically have a strong integrated GPU. I personally doubt local AI is worth it at all -- all good models require insane hardware. But even if you disagree, the M-series MacBooks are very well equipped to run local AIs, better than most Windows laptops coming out currently.

Completely disagree. We support ARMv8.x (forget what x is) on Amazon Linux running on Apple Silicon via Asahi. Been doing it for over a year.

  • whoosh

    • ...and after China just slapped you upside the head ($1T moved away in a couple of days last week) with DeepSeek (oh man, that name -- and it doesn't seem like anyone gets it, haha)?

      It sounds like it could help you to know my opinion: cloud-based training with massive capacity and, of course, cloud-based inference the way Apple does it (with PCC) make a lot of sense; but the idea that centralized OpenAI/Anthropic-style inference engines would somehow carry out the applications for most of mankind is truly ridiculous. (Hey, Meta got something right for a change -- you know, because their hands are on the actual wrenches, unlike the execs at those other companies.) The kind of Cloud AI you see today will die with the imposition of the very heavy regulations that are already on their way, and those systems will become brokers for more distributed agents to find other agents. That's a good thing for them, because it gives them a way to preserve (rather than cannibalize) their existing business models.