Show HN: Building a web server in assembly to give my life (a lack of) meaning
4 days ago (github.com)
This is ymawky, a static file web server for macOS written entirely in ARM64 assembly. It supports GET, PUT, DELETE, HEAD, and OPTIONS requests, and supports Range: bytes=X-Y headers (which allow scrubbing for video streaming). It decodes percent-encoded URLs, strictly enforces the docroot, serves custom error pages for any HTTP error response, supports directory listing, and has (some) mitigations against slowloris-like attacks.
I’ve also written a more detailed writeup here: https://imtomt.github.io/ymawky/
If you actually start writing big stuff in assembly, especially with a macro assembler, you'd quickly realize it is more verbose, but not fundamentally that different from higher-level programming. You basically need to get the hang of building abstractions with procedures and macros and you'd be good to go. Reading assembly effectively is often much harder than writing it.
Yeah, that's what I realized during this, too. You need to be much more explicit, but the way any given function works isn't fundamentally different. "strlen" will always iterate through a string searching for a NULL byte whether it's in C, Rust, Assembly, or whatever other language. I think it can feel almost more straightforward than other languages, since you're laying out exactly what the CPU needs to do, in what exact order.
> "strlen" will always iterate through a string searching for a NULL byte whether it's in C, Rust, Assembly
Not all languages use NULL terminated strings. I think Rust actually stores the string length alongside a pointer to the start of the string data. You can do the same in C, but you'd have to do it manually using a struct. In assembly you could do the same thing since you get to decide basically everything.
https://www.youtube.com/watch?v=y8PLpDgZc0E
4 replies →
Agreed. And super cool project. After seeing Matt Godbolts Advent of Compiler Optimisations in December I decided to do AoC in assembly. Was the most fun I had in years even though I didn't finish all days!
And super educational. Since then I've been pondering which problems require dropping down to the assembly level. E.g. implementing a JIT compiler, a coroutine runtime, etc.
It's a beautiful project, well crafted. To reflect on the other comments: projects like this are more like a Minecraft map for me. There are giant and amazing maps, small survival maps locally hosted for my friends and myself, and commercially focused high-scale servers. Building a house or designing a new road in the server became extremely easy with AI, but the value created in the world depends on the original purpose of the server and whether creating more houses and roads actually makes sense. I think it's a super thing that a commercial server can build out faster and be bigger, with more houses and roads on it, but the love an art project creates in the world is incomparable.
Gave me a warm feeling to know that someone would actually still bother to do this by hand. I'm not the only one!
Thank you! I've been obsessed with this idea for a while, finally decided to start on it, then obsessed over it for a couple weeks. I'd love to see some of your projects if you have anything similar, I'm glad I'm not the only one too! I think most programmers would benefit a lot from taking a few weeks or months to try and learn some assembly, and demystify how CPUs and compiled languages work.
What I love most about your project is the super clean Makefile (https://github.com/imtomt/ymawky/blob/main/Makefile). It is a great template to jump start an ARM64 assembly project.
I made a self documented (all the related links are in the file) single makefile (https://gist.github.com/ontouchstart/d3ad8e4d0adf63532303a90...) so that anyone can build ymawky from scratch and dig deeper by tinkering.
The reason I wrote that compact makefile is that I am at the beginning of a long journey to study the coding style of ngnk (https://news.ycombinator.com/user?id=ngnk).
I can see the potential of doing some implementation of APL/J/K at levels even lower than C, like how those guys did APL\360 using assembly language. It is going to be super fun in the era of everyone using LLM to pump out verbose Python/TS/Rust code with context windows bigger than the whole operating system.
That fake O'Reilly book cover is pure gold.
That book is exactly what inspired me to make this in the first place, haha. The subtitle of the book gave me the acronym I named it after.
I frequently reference this exact meme to people whenever someone complains about complicated or difficult-to-write code. Now that you’ve made this project I suppose we have zero room to complain anymore!
Fauxreilly!
Even though it's a meaningless comparison, I'd be interested to see how performance compares (max requests per second?) for this compared to fully-featured web servers.
Honestly haven't benchmarked it, but I would imagine ymawky would be considerably slower than most fully-featured web servers. ymawky uses fork-per-connection, which is fundamentally slower than what production servers like nginx or Apache use. nginx uses event-driven IO (kqueue/epoll), which can handle thousands of concurrent connections without the overhead of forking the process on each request. Apache uses pools of threads which handle multiple connections without needing to be spawned per-request. A head-to-head against any other web server would mostly measure "fork-per-connection vs event loop/thread pools", which assembly has nothing to do with.
In a comparison between a similar fork-per-connection server written in C and this, I would imagine the throughput would be about the same, because the bottleneck in this model is fork() itself rather than the actual code. It probably matters more for binary size and startup time than requests/sec. Would be fun to actually benchmark, though.
Should I ask my Claude to benchmark it against nginx, or will you ask yours?
I am attempting to write a software renderer in WebAssembly because, for some reason, I feel the need to go against the direction this vibe coded world is going, and I want to feel challenged again. I don't know if I will ever finish it, it is crazy, and by no means useful. But gosh it feels so good.
Congratulations to the OP for the accomplishment.
Please post your progress! That sounds cool as hell
Thank you! I will keep working on it and post something here
I did exactly the same and it was so much fun. It wasn't about bringing anything novel to the table, it was just a fun challenge for myself. I finished and now I'm writing a game using it, although now the challenge has gone I am not making much progress on that. But never mind, I had fun! I wouldn't have had that fun or satisfaction if I had vibe coded it instead.
As in 3D software renderer? I cut my teeth on those throughout my teens and the start of my professional career, in x86 and C.
I wanted to see how an LLM would do writing one in pure 8088 assembler for CGA and it one-shot a nice demo (I fed it the vectors for the Elite ship in the prompt):
https://imgur.com/a/Dy5rUku
Yes, exactly, a 3D software renderer. But the goal is to do (almost) everything from scratch and by hand. No LLMs, no std library, no compilers. Just a few imported math functions (such as sin and cos). Not the same as bare metal programming but close
3 replies →
How well does that run on real hardware (or PCem/86box)?
2 replies →
Well done. Been working on a similar smaller project for RISC-V. This is excellent
That's so cool! I would love to see it if you're sharing it anywhere.
It's an HTML browser for the Pi Pico 2, CLI-based, meant to support my in-house project running on a mesh of Pico 2s. I really wanted to use RISC-V, and it needed a webhook that serves a page on PIO wake. The browser is about 60% written; the server is now already handled ;P I found this awesome project someone posted on HN. When I complete my project the browser will be released alongside it. You can very likely reproduce it with less than a handful of prompts. One thing I really do believe: ideas are going to be the next open source. LLMs can make ideas into things.
This is cursed and wonderful. I especially appreciate status code 418. I hope I run into that in the wild one day, then I'll think of you!
We are moving to AI and stopped writing code / scratching our heads, and you're here writing a web server in assembly.
Humbling.
Yeah, humbling - I know which path I prefer
Who's "we"? I have too much self-respect to turn in crappy clanker-generated code rather than doing a good job myself.
This is fun, thanks for sharing. I have a much more minimalist one for x86 Linux if you want to see what that looks like: https://github.com/jcalvinowens/asmhttpd
Oh wow! Your project was actually really inspiring to me, thank you for making this. I was really impressed reading through it a while back. Is it alright if I add a link to your repo in my README?
How fun! I'm glad it was interesting, please feel free to link to it :)
Hmmmm.
One of my first assembly projects was a CGI Script 100% in x86 assembly.
A full web server is certainly more impressive! Though I'd recommend to beginners to look up CGI and mod_cgi in Apache first lol
Woah! I honestly feel more intimidated writing a CGI script in assembly than I was writing a server, lol. CGI support has been on my mind for a couple weeks, but I haven't really dug into it yet. I'd love to see yours if it's hosted anywhere! Could be a great reference when I do.
Really? It's a bit of a nonsense that I did so long ago so it's weird to hear someone interested in it...
The script has been lost to time. I wrote it 5+ computers ago and I don't even know where I put that backup...
The overall gist is that the CGI specification sets environment variables and wires STDIN and STDOUT to various values. A minimal pure-assembly program that writes <h1> Hello World </h1> over stdout is your minimalist CGI script.
A bit of research into what those STDIN/environment variables are is needed for anything more. I knew all this maybe 20+ years ago but have long forgotten...
With access to the various input parameters offered over CGI, you can easily access form data (buttons and whatever clicked by the user). Use some smart file writing to store sessions and off you go....
-------
Maybe start with a Perl CGI tutorial. Then go backwards to C, and finally raw assembly by hand
3 replies →
This is brilliant, I love it.
The absolute best projects ever posted here are the ones made for no other reason than “just because I wanted to”.
Thank you. My favorite reasons for projects are "just because I wanted to" and "because someone told me I shouldn't", and this one ticks both boxes, haha.
With the bubble of LLMs, these projects are really appreciated. Keep up the good work!
P.S.: I would love a copy of that book please!
Love the honesty in the title. I'm 49, learned to code two years ago after 30 years as a carpenter, and "to give my life meaning" hits closer than most career ladders ever did.
Bookmarking this — going to read the source on my Sunday off.
I'm wanting to read this repository as a learning tool, so it'd also be nice to include docs—even AI-generated docs, but obviously I'd prefer docs with your own design notes and decisions—about the architecture of the code.
Really cool project though!
Thanks, I appreciate it a lot! I tried to comment my code pretty heavily (~3000 lines of code, ~1000 lines of comments all together), since this was a learning project for myself in the first place. Hopefully those will be of some use. But separate in-depth documentation is definitely a good idea, I'll work on adding that. In the meantime I'm always down to answer any questions about it!
My first question would be where should I start reading? It seems like you modularized it into multiple assembly files (how does that even work?)
1 reply →
If you'd be happy with that then you can generate them yourself!
Syscalls on macOS aren't guaranteed to be stable - Go found out the hard way and in 1.12 they changed to call libSystem.dylib instead.
In general, stable syscall numbers are just a Linux thing. Everyone else uses blessed system libraries
Yeah, I know macOS syscalls aren't stable. Interesting point about Go, I hadn't heard about that. Unfortunately I'm a masochist though, and want to avoid libSystem.dylib as much as possible. The only reason I link against it at all is because macOS requires it for executables to run; I never actually call into it. Figured I'd just update the syscall numbers if/when they change.
You could also do a trick some Windows stuff does - parse syscall indices from said dylib.
What was the hardest part, Range header handling or the slowloris mitigations? Both seem like they'd be a nightmare without higher-level state machines.
Slowloris mitigations actually weren't too bad, just a couple syscalls to setitimer(), sigaction(), and setsockopt(). Range header parsing was awful, and so was content length. I'd say all in all, probably Range headers. String parsing in general is pretty awful in assembly.
Here's a piece on writing portable ARM64 assembly: https://ariadne.space/2023/04/12/writing-portable-arm-assemb...
Thanks for the link, bookmarking. I should note ymawky's main portability issues are unfortunately at the syscall layer rather than the asm layer. proc_info() and getdirentries64() are pretty Darwin-specific, so making it portable would require reworking that whole area rather than adjusting register/calling conventions.
I don’t know why, but this project has me irrationally excited!
Your determination to make this happen was remarkable — and you truly accomplished it. Congratulations
I feel the guy’s suspicion towards any high-level language. I programmed exclusively in assembly on the C64 and Amiga, and then recognized that this wasn't sustainable on PC, because there are more and more edge cases and different machine configurations.
I had a very hard time simply using and even utilizing C++ or Java.
C, and Turbo Pascal especially, were easier because the compiled code closely resembled hand-written code.
As the author described, you do in 4,000 lines what others can do, with way less pain, in 100.
So you build macros, come up with your own library and in the end you kind of build a meta language build on top of assembly because some lines are so hard to grasp that you delegate working code into a library for reuse.
It is funny how much we take conventions for numbers for granted. If you happen to know assembly and its intricacies, you immediately learn to work with the sign bit that marks negative numbers. But how do you know? Maybe you use the whole addressable space only for positive numbers.
Small things that make a huge difference.
Nice article, I enjoyed your adventures and would do the same.
Thank you! The thing about eventually building your own meta language ends up happening all the time with bigger assembly projects. I do have a fair few quality-of-life macros too, but probably fewer than I should. I did end up needing to implement by hand what would be standard functions, things like atoi, itoa, strlen, memcpy, streqn.
Higher level languages are more convenient for 99% of things, but the directness of Assembly gives me a rush unlike any other. I didn't live through the C64/Amiga, but I was obsessed with old C64/ZX emulators growing up.
I don't know. Certainly the PC had a lot of options, but it wasn't impossible. My first piece of commercial software was written entirely in x86 assembler and had to navigate things like graphics card options and multiple sound card options. It could be done, it was just a lot more of a PITA.
Once I was doing 3D I quickly started moving everything but the inner loops to Turbo C, because I'm not a total masochist :)
Even after we've all retired (pretty soon for those who can afford it) or transitioned out of software engineering (for those who can't), we'll still get to amuse each other with home-brew projects like this. Warm fuzzy feeling - I'll take it!
Thank you! This is one of the nicest things I've heard in a while.
I suspect that the test suite isn't great. Bun has so many different behaviors compared to other JS engines, sometimes just plain wrong or contradicting the spec. The test suite didn't catch those...
Wrong thread?
This is awesome! I'll have to try reading through the code when I have more time.
It would be awesome to read a blog post about the project. Your approach, lessons learned, unexpected stuff, etc.
Awesome. Any resource recommendations to learn ARM assembly?
Honestly, just reading existing assembly to get a feel for how it works, and then violently googling everything that goes wrong. The ARM Architecture Reference Manual (aka "The ARM ARM") ended up being really helpful for looking up what specific instructions do and how they're called. Another really helpful tool is writing something in C/C++, and compiling with "gcc -O1 -S file.c" to see the assembly gcc generated. It helps to mess around a lot with smaller programs in gdb or lldb.
[dead]
Didn't Steve Gibson do this like 25 years ago? AFAIK his "Shields Up" site is written in Win32 assembly.
Then it is unlike this, as this is written in arm64, not x86, and not for Win32.
This is amazing, great work! I love it!
Need a straight binary port now
Why stop there? Next, I'm prying open a CPU and poking the transistors with a 9V battery and paperclips to make it execute what I want. Slower, but you get so much control.
If it is written in assembly, why is it for macOS only?
Assembly for the correct architecture is only one part of getting an executable running on a machine.
- Dynamic libraries (e.g. for calling into the kernel, but also user space dynamic libraries) are OS-specific (.so for Linux, .dylib for macOS, .dll for Windows)
- Executable format is OS-specific (ELF for Linux, Mach-O for macOS, PE for Windows)
- Dynamic loading and linkage of both the above are also therefore OS-specific
And even if you avoid external libraries, you still need to interact with the kernel to do I/O, and that involves system calls that are also non-portable.
The hero we needed
The comment about LLMs writing this is interesting but I think it misses the point. Someone still had to know what to build and why. The assembly knowledge needed to even prompt an LLM correctly for something like this is not trivial. The craft shifted, it did not disappear.
Love this so much.
This looks genuinely great. Well done.
Insane
It's true
Where's your SKILLS.md? How did your agents make this?
jk. Metal as fuck. Love it.
Ahh you caught me. I just kept telling ChatGPT dot com "no, make it less efficient" and copied whatever output it gave me. jk, thank you!
do you know about rwasa?
https://2ton.com.au/rwasa/
...
interesting!
This is a great resource, thank you!
The last time I did anything in assembler was x86 under DOS. Your code makes ARM64 with a modern OS less scary than I thought it would be.
Arm is very nice to write assembly for. Having a proper load/store register-centric architecture rather than a stack-centric like x86 makes the mental load of writing code go waaay down, so the attractiveness of HLLs for ease of writing code is greatly diminished on RISC.
Hell no. Far too many registers, not enough instructions, and (especially with ARM64) weird restrictions that arose from trying to pack things into 32-bit instructions as efficiently as possible.
I've been writing x86 Asm for a few decades. RISCs are simpler in all the wrong ways. After all, "just use a (stupid) compiler" was the whole philosophy.
1 reply →
This is fucking nuts
I'm oddly enthusiastic about seeing someone who brings the HACKER in Hacker News. But at the same time, this made me remember the days when displays of skill and craftsmanship were rewarded in the industry.
Maybe it's finally time to move on from being a career programmer.
What a dismissive comment. Now that anyone can have an LLM write code for them, the only people who have value to bring to a project are the ones who can improve upon the LLM's output. That is, the ones who have a deep enough understanding of the logic and language. And the only people who will ever be in that position are the ones who take the time and effort, out of sheer curiosity, to learn how things work. Whatever your alternative is to this, there is no future in the alternative.
Artisanal code has a future. Maybe not a highly paid one, but maybe we go back to our roots. If you enjoy programming and were never focused on output or on pipelines, an LLM doesn't offer the same experience.
3 replies →
I don't see anything dismissive here. It is a realistic assessment: if the choice is between code generated by AI or code generated by a human, and the AI is better in an objective manner, then why should a company employ a human? I refer here solely to the code result; naturally humans may do things AI can not do yet, but if the question is solely about code quality and AIs are better here, then why would that comment be dismissive rather than realistic?
> And the only people who will ever be in that position are the ones who take the time and effort, out of sheer curiosity, to learn how things work.
People learn something new all the time; AI does not learn anything, it just simulates and hallucinates. But the core question is not addressed with that. What would you do if you have to compete against AI, and AI is better? We already see this with the new generation of humanoid robots from China. Those things make Boston Dynamics robots look like tinker toys in comparison - already as-is. Give it ten more years and we'll finally reach AI Skynet for real.
1 reply →
I don't see it as dismissive, maybe you two are talking past each other but seem to be on similar side. I think the parent just articulated a sense of resignation that many people probably share. I think you might be saying that maybe there is still some shred to hold onto, possibly.
Don’t be naive. Anthropic's (et al.) mission is to make us unemployable. They need to sell their tools to companies so that those companies can finally discard 90% of their workforce. It’s a win-win for the companies and for Anthropic (et al.). Obviously we are the losers in the middle. And people around here on HN may think they cannot be affected, that they are the elite class of developers… they are gonna get hurt
success through perseverance and toughness - fucked by AI
success through cleverness and inventiveness - not yet fucked by AI
achievement through stubborn persistence - you can still dig deep holes in the garden
you still could have a character, if you were lucky
human agency? not yet fucked up, but it's gonna be
achievement earned through one's own qualities or effort? - intact somewhat
I've used Python (django/flask/fast api), Java (springboot), Ruby on Rails for writing web applications and APIs.
Nothing beats Go.
When you use HTMX (goat) + sqlc (goat) + pgx (another goat) + Chi (yet another goat) and SQLite (goat).
Most apps will not need anything more than SQLite; I've got several SQLite apps doing a couple of million visits per day.
Compiles to a single binary blazingly fast.
Deploy using a systemd service, capture logs with an Alloy/Loki/Grafana setup, set up alerts and monitoring, and go home.
And you can serve millions of requests on a server with 512MB RAM.
I don't think you'd ever need more speed than this.
Everything else is bloated, slow and doesn't give you enough room for optimization.
Here's the latency of one of my hobby projects (network latency not included): https://i.ibb.co/hJ6FQtyw/d3d6c9d15765.png
Request rate: https://i.ibb.co/Fq80nfJ4/67fcdbdb7491.png
It's running in the US and EU (helps avoid the Atlantic roundtrip tax). In this one I am doing some hundreds of checks, not simple CRUD work. With Go you can optimize a lot without the complexity of Rust.
Can you share what some of those apps are?
I’ve written all of these languages and more professionally. I agree none match the speed and simplicity of golang. Go is that efficient.
How are you merging sqlc and pgx with sqlite ?
Specifically how can you use pgx with sqlite while pgx is a postgres-specific library? Sqlc works great with Postgres or Sqlite, Sqlc works with pgx when connecting to Postgres, but pgx can't be used with Sqlite AFAIK
Did you reply to the wrong submission?
They replied to the title.
What's your point?
You don't need to use assembly for high performance web app when you can just use Go.
2 replies →
This post seems to now link to the writeup rather than the repository, sorry! The repo can be found at the top of that page, or directly here: https://github.com/imtomt/ymawky
Whoops that was my fault. Fixed now. (I emailed you, btw, that we'd changed your title, but I forgot to switch the URL back to the repo. Both links are cool.)
I'm sure I'm not the only one who has fantasized about doing something like this as a self-soothing enterprise. Kudos to you for actually doing it!
Hey, thank you! Means a lot. It's an odd sort of meditation, but it's surprisingly the most therapeutic project I've worked on. Something about the constraints of assembly really pulls you into the minutiae and clears your head, maybe.
Ten years ago, I would have kowtowed to someone elite enough to build something like this.
Today, I just think, "how long would LLMs have taken to write this?"
I mourn the death of a human artform.
It's far more exciting than sad.
Got an idea that you'd need assembly language for - now you can do it instead of..... never doing it because it would have been impossible for you in any practical way.
Look to the positive instead of lamenting something that never would have happened.
It's unbelievably exciting that you can now program a computer virtually without the limitation of your ability to hand code it.
The result is unimpressive either way -- it's the journey that is exciting for these kinds of projects
9 replies →
You won’t be able to enjoy your free time playing with computers if anthropic et al make you jobless.
The “you” doesn’t necessarily refer to you. I'm addressing 90% of the developers out there. We love playing around with technology… but I doubt we will be thinking the same once we become unemployable. But here we are, having fun with the tools of companies that want to finish us. How ironic.
3 replies →
> Got an idea that you'd need assembly language for - now you can do it instead of.....
Nobody actually needs a web server built in assembly language, it serves no practical purpose. And I say that as someone who learned to program 6502 assembly language in 1983 and has sporadically used assembly of various architectures since.
The absurdity of building it would have been the curiosity draw pre-LLMs, but when it existing is just a series of prompts away it really loses all of its meaning.
But yeah... hooray for AI. Can't wait until we learn to harness it to supercharge the most important and valuable thing we do as a human society in modern times: stuff increasingly intrusive ads in front of everyone at all times.
2 replies →
It has always been possible to do it. LLMs are not a particular enabler for that.
The difference is that now it is worthless: there is no learning, no person caring about the result, nothing aspirational for the public to look towards... we used to enjoy those challenges, used to be proud of solving complex problems... now? Yeah, whatever, execute execute commit push, let another LLM "review" and call it a day.
17 replies →
If you've got an idea that you need assembly language for, you can use a compiler to create that assembly language. It'll probably do a better job than an LLM. Assembly projects are interesting because they're written in assembly, not because they contain assembly.
3 replies →
This idea that LLMs are going to enable us to do things that we wouldn't have done before, and that therefore overall productivity and value are going to increase "exponentially", seems naive about basic economics.
If LLMs are good for doing things we aren't already doing, it indicates the overall addressable "value" that LLMs could provide for such things is actually quite low. If the task has necessary prerequisites that you don't currently possess, but you haven't spent the effort to jump that hurdle yet, then it's a good indication the value of completing that task is very low. Even if, maybe especially if, we're talking about personal projects where the value proposition is personal and not monetary, it indicates the person already feels in their bones that the return on doing this thing is not worth the effort.
I'm struggling with this with my leadership at work. We have developed a thing that is going to remove the need to hire temps [0] for data entry when we get clients who send us large amounts of their "data," aka "a thousand 30-slide PowerPoints each with one line graph of interest sitting in the corner of one slide somewhere." It is an ask that comes up a lot, and it's always very expensive for the client in both time and money, but overall it's just a small part of the contract budget. I'm all for using what we've built to cut down the time cost for our clients, but my leadership, thinking it's going to lead to massive cost savings for our clients, seems to forget just how much time we spend in meetings and planning and documentation and testing and reevaluation and more meetings versus actually executing on things.
It's also bad business. To me, giving results faster should be a premium offering. We should be charging more, not less.
[0] We don't actually hire temps; we turn our junior data analysts into temps by burning them out with tons of unpaid overtime. They then leave and we have to backfill them at rather extreme hiring overhead compared to the direct contract overtime we didn't provide.
I think for programmers the enjoyment is in writing it on their own, not just having a toy. If I just wanted a web server in asm, the easiest thing would be to decompile an existing one into assembly and call it a day.
Only exciting if you already got a lot of programming under your belt, like Carmack, or a product guy.
> without the limitation of your ability to hand code it.
Isn't that kind of view pathetic and sad, though? Why would anyone pick up a guitar or play a piano if they could just listen to the same song already made by someone else? I struggle to understand this view from people who pretend not to understand why being an expert at some skill is perceived as valuable by some people. This also belies the next problem with this line of thinking, which is that it says "we don't need to learn X to do Y because we have AI" but misses that the same AI could easily replace the need to have you think to do Y in the first place. I don't know.
4 replies →
> Got an idea that you'd need assembly language for - now you can do it instead of..... never doing
But you're not doing it. The ai is doing it.
If the op can write a web server in assembly language then I'm pretty sure they could have done it in a higher-level language. But they did what they did for the journey and the learning along the way. Vibe coding it omits all that, and misses the point of the exercise.
I do believe this is just a next step in languages. We've come this far trying to make code NLP, now we have the closest thing to a translator in our generation. It's an exciting time, just don't pay attention to talking heads.
the biggest issue with llms is that they make those who have no idea what they are doing seem like they know something.
>> without the limitation of your ability to hand code it.
yeah it's nice, though in 100% of cases this results in software of even lower quality than we had before.
so hard to tell where the win is here. the fact that you can generate some code does not make it a win, just a curious fact.
Which is why now companies can happily reduce head count.
> Got an idea that you'd need assembly language for - now you can do it instead of..... never doing it because it would have been impossible for you in any practical way
If you are having an LLM generate the assembly language for you, that is not even remotely close to writing the assembly language yourself.
I don't find it exciting even in the slightest. I can think of nothing more boring and unsatisfying than having an LLM generate all of your code for you.
I mean, I understand why some think this could be exciting from a "I can get something done fast because the LLM generates it for me" standpoint -- because their excitement stems from something getting done at all instead of just sitting in the pool of ideas forever. However, you will never know the code generated by an LLM like you know the code you wrote yourself. Also you will never gain the same satisfaction of finishing a project where the code was written by an LLM that you gain from finishing a project where you wrote the code yourself.
If you are a person that doesn't care about coding or doesn't like to code at all, I could totally see why you'd find this exciting - to you it's all about avoiding work you don't care for or want to do yourself anyway. Also, a high percentage of people who do love coding have zero interest in writing assembly language, so if they were required to write some for a project, I could also see them being happy with having an LLM generate that part of the project for them.
However, I think for people who genuinely love to write code, the situation is the opposite of what you said -- it is far more sad than it is exciting. In fact, for many of them it has already reached the point of depressing for many reasons. I don't think it is primarily because the LLMs have gotten significantly better at generating code (which they have). I think some of the bigger reasons are that so many people who now pay people to produce code have:
1) got a very short-sighted and "rose-colored-glasses" view of what LLM-produced code will do for their company.
2) deeply under-appreciate the value of having a person or team of persons who understand their business, the hardware and software required to support their business, and the work required to both keep things running and handle new requirements as they come along. Because of that under-appreciation, many already have punted (and/or are preparing to punt) those people to the curb because they think they can just have an LLM do their job and save a ton of money.
In the long run I think most (if not close to all) of those businesses are going to be sorry if they over-indulge in replacing human-produced code with LLM-produced code. I think the ones who lean too heavily on the LLM side are going to eventually collapse into a heap of unmanageable dumpster-fire code that they can't understand or maintain. A whole new world of incidental complexity will consume every project, and in the long run it will just eat them alive (figuratively speaking, of course :-D ).
I think that the analogy of recorded music best captures your feeling. Not the exact technological and economic transformation that is happening, but the feeling.
Some 120 years ago, before recordings, music was a living phenomenon produced in the moment. Musicians worked at restaurants and coffee shops everywhere, being useful without being super stars.
Music didn't disappear with recordings, but the world is certainly different.
The analogy is AI music produced without effort in seconds, not recorded music.
It doesn't diminish the art form though. If anything, I value these kinds of hand written projects even more now that so many people are pulled in by AI doing their projects for them in a fraction of the time and effort. I love doing these kinds of projects, and I love writing assembly, but I must admit that the temptation of just copy pasting generated code is big sometimes, because it's _right there_. In this context, seeing someone handwriting something awesome by hand is even more valuable to me.
I think the parent's point is: a couple of years ago handwriting was the only option. You'd see a post like this and know it was something special.
With LLMs, we can't tell anymore if something is a labor of love with hundreds of hours of work behind it, or half a dozen prompts to Claude Code.
1 reply →
The answer is "no time at all." I used Gemini Ultra earlier this year to see how well it would do with some really gnarly assembler. I asked it to write a whole flat-shaded 3D engine in 8086 assembler that would run in CGA on an original XT and it one-shotted it in a couple of minutes.
https://imgur.com/a/Dy5rUku
Look at the bright side, it's much more feasible now :)
Yes, it's not deterministic; if you were using it commercially, the ROI would be terrible; and it's certainly not reliable. But for a hobby project... why not?
Encouraging people to understand the layer of abstractions they're building on is helpful, doesn't matter if they do it by hand or with clankers.
LLMs lower the barrier for execution - they make you faster. The unstated question is: faster at what - they can make you faster at something clever, or faster at the entirely wrong thing...
Your point is correct if we're looking at it through a scarcity lens - the effort to make it certainly decreased a lot - but that doesn't mean that anything is now worthless. We can just move onto doing bigger, better things now, until we hit the next limits...
> Encouraging people to understand the layer of abstractions they're building on is helpful, doesn't matter if they do it by hand or with clankers.
But we know that long term use of LLMs does not lead to better understanding, it leads to reliance on the LLM for the person to be able to function at their job.
2 replies →
> I mourn the death of a human artform
Well, look at it this way: the needs of commerce are going to resolve the conflict between the practical and the beautiful. I think those of us who value the beautiful aspects of coding will find new avenues of expression. For example, I'm about to get back into C programming to build a play.date game engine for an MMO.
Haven't used LLMs for assembly yet, but I did try using them on some DSLs with few docs; the results were much less impressive than with the popular, higher-level languages AI companies scraped a gazillion repos for.
It was an artform and a necessity. It's even more of an artform now.
I appreciate the effort. But why not put the skills into something that would be genuinely useful to others or solve a pain point? Open source the results or handover to someone who would like to maintain it.
3 replies →
I used to LOVE getting into a good flow state while programming and was very proud of clever code I made. Now I just think how much faster an LLM could have generated the code.
> Ten years ago, I would have kowtowed to someone elite enough to build something like this.
I'm afraid it's an elite skill in the sense that juggling is also an elite skill. It's impressive for the first few seconds you gaze at it, but once the novelty factor wears off you understand that it's wasted effort that leads to a project with a massive maintainability problem, is limited in which platforms it can run on, and brings no advantage whatsoever. It's a gimmick that has no practical use.
This is the software development equivalent of an amateur guitarist posting shredding videos on YouTube.
What an odd take. It is often titled "software craftsmanship". Is the craftsman not allowed to practice? Not everything needs an immediate real-world application. Not everything needs to be enterprise-grade, bulletproof, web-scale or whatever. It needs to work for the creator, and sometimes not even that.
In the same way we appreciate Japanese wood joinery, why not just appreciate this? Someone might even learn a trick or two reading it.
3 replies →
But if it was written with an LLM, then it's not really written in assembly.
Would we consider a server written in C to have been written in assembly? No.
You can say the same about manual development of photographs, LP collections, or text written on a typewriter.
It's not the end of the world. It's a change.
It's not just art though. It's human thinking and problem solving and learning that is dying.
I have a (relatively well informed) view that people who allege that "if you used an LLM you did no thinking or problem solving" have never in fact used an LLM to generate anything particularly complex.
You do indeed need to do quite a bit of thinking and problem solving, to build things with an llm.
If you disagree, repeat this project, so you can share with us how little thinking it required.
1 reply →
So what art form can a human make with an LLM assisting?
I get what you mean, but I feel this new profound yearning for "hand-crafted" code is getting a bit out of hand. Software engineers have taken shortcuts whenever possible since software was a thing. Do you also mourn that we don't code airplanes by hand anymore (i.e. the death of the "craft of coding")?
We need to stop thinking of software engineers as carpenters, where the magic is some physical skill and that is the "CRAFT WE MUST PROTECT".
And at least your comment was grounded in reality; a lot of people I talk to (who are not coders) seem to think a good software engineer writes every line and every word with thoughtful genius while AI just spams code, so one is better than the other. And they're convinced it's some nuanced, smart take and that they understand software development on an inner level or whatever.
And the base assumption still holds true (pure AI-generated code is garbage), but that's mostly because it's badly designed and an LLM is still a pretty poor architect. And there is a need to push back against slop, but why do we need to elevate typing code as if it's some sacred activity? Most of the work a good coder does is in their mind, with little connection to the physical reality of the world.
In this case you could even replace "LLM" with "C compiler" and it would change nothing.
Look, I still got my physical copy of Michael Abrash's Graphics Programming Black Book with its genius content about hand-optimizing cycle count on 486 and Pentium processors, beating compilers at that time.
It was an absolute artform, but it's completely obsolete today.
> I mourn the death of a human artform.
The artform only dies if you let it. Even if your employer is so idiotically myopic as to forbid you to ever write your own code, you can still continue the art on your own time. I for one don't care how "good enough" any AI-labeled technology gets at writing code. I will continue to hone my craft until the day I either die, become too unwell to do it, or some other creative endeavor consumes all of my personal time.
Human artform is still alive and well as evidenced by this post.
Yes, an LLM can write it, it’ll probably work. Yet, it’ll remain meaningless slop while this is not.
The problem, at least for me, is that by now I'm so desensitized that I won't even bother looking at something, because it could potentially be the product of a few prompts. The LLM noise is drowning out the human signal, so to speak. Same for articles, blog posts, etc. It only takes a few em-dashes, a few "it's not this, it's that" to lose faith in the text's authenticity, and with that, any interest in its content.
1 reply →
Your determination to make this happen was remarkable — and you truly accomplished it. Congratulations
An agentic LLM should be pretty good at Arm64 assembly generation, but maintainability of large code could become an issue. Why would it not run on Linux?
I wrote it for MacOS because I don't have a Linux machine right now :( Once I get one up and running again, I'll probably work on porting this.
As for why it wouldn't run on Linux, there are some pretty big differences in the actual assembly. One fairly superficial difference is calling conventions: MacOS uses the x16 register for syscall numbers while Linux uses x8, and calling the kernel on Mac uses "svc #0x80" where on Linux it's "svc #0". That's ~120 lines that need to be replaced, but easy enough to fix with sed. Syscall numbers are all different, as are the struct layouts for sigaction(); MacOS has an "sa_tramp" field that Linux doesn't have. Enforcing max processes is done here using the MacOS-specific proc_info() syscall, which can be used to get the number of children any given process has. Linux doesn't have an equivalent, so process tracking would need to be done differently. Finally, Linux has the getdents64() syscall rather than getdirentries64(), which uses a different struct and is called differently.
I'm sure an LLM could make all those changes, but it's a pretty large codebase, so it would probably make some mistakes or miss things.
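To make the calling-convention difference concrete, here's a minimal side-by-side sketch of the same write(2) call on both platforms (the `msg` label and length are illustrative; syscall numbers are write = 4 on macOS and write = 64 on Linux arm64):

```asm
// macOS ARM64: write(1, msg, 14)
        mov     x0, #1              // fd = stdout
        adrp    x1, msg@PAGE        // buffer address (macOS relocation syntax)
        add     x1, x1, msg@PAGEOFF
        mov     x2, #14             // length
        mov     x16, #4             // macOS: syscall number goes in x16
        svc     #0x80               // macOS kernel trap

// Linux ARM64: the same call
        mov     x0, #1
        adrp    x1, msg             // buffer address (ELF relocation syntax)
        add     x1, x1, :lo12:msg
        mov     x2, #14
        mov     x8, #64             // Linux: syscall number goes in x8
        svc     #0                  // Linux kernel trap
```

Even this tiny example shows it's not just the trap instruction: the relocation syntax for addressing data differs between Mach-O and ELF as well.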
Is it not possible to maintain a single codebase for both OS with appropriate .ifdef commands?
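Something along these lines might work; this is only a sketch assuming a GNU-style assembler (`.macro`/`.ifdef` directives) and a `LINUX` symbol you'd define at assembly time, e.g. via `--defsym LINUX=1`:

```asm
// Hypothetical portability macro: pick the syscall convention at assembly time.
.macro DO_SYSCALL mac_num, linux_num
.ifdef LINUX
        mov     x8, #\linux_num     // Linux passes the syscall number in x8
        svc     #0
.else
        mov     x16, #\mac_num      // macOS uses x16
        svc     #0x80
.endif
.endm

// Usage: exit(0) — syscall 1 on macOS, 93 on Linux arm64
        mov     x0, #0
        DO_SYSCALL 1, 93
```

This handles the numbers and the trap instruction, but the differing struct layouts (sigaction, dirent) would still need per-OS data definitions behind the same kind of conditional.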
1 reply →
You could always start in a virtual environment.
The first paragraph of the README says this was hand written so I’m not sure why you’re bringing up LLMs
Because it's absurd for most people to write that much assembly by hand.
7 replies →
Get a better f* meaning to life, will you?
Example: spend time looking at health/nutrition. I assure you that in 5 years you will get more satisfaction and returns than the assembly code you wrote.
One important caveat - the subject of health/nutrition is SIGNIFICANTLY more complicated than assembly code, and not many sources out there know what they are talking about. Computers are child's play compared to biology.
You good bro?