Comment by andkenneth
11 days ago
They recently tried to upstream an improvement to zig, but were prevented from doing so because zig has a hard and fast "no AI code" rule. Whether you think this response is trying to put pressure on zig or whether they're just moving for practical reasons is up to you.
It's probably a bit of both.
I don't see why they think it would work, when their patch set was rejected because it was not correct, went in a direction the Zig authors weren't interested in, and touched an area where they are already working hard on improvements. It would have been much better if the bun team had joined forces and helped out instead of vibe coding a broken PoC patch that can never get merged. Compilation speed is one of the current main focuses of Zig, and changing the type system to make that possible was a big part of 0.16.
Anyone can hack up a quick PoC, even without LLMs; the hard part is writing code that is correct and maintainable.
Side note, but I think using LLMs like this to write PoCs in existing projects is actually a good idea to prove whatever you had in mind is feasible and worth it to pour time into. Obviously you need to not vibecode the entire thing once you're past that point though...
> It would have been much better if the bun team joined forces and helped out instead of vibe coding a broken PoC patch that never can get merged
Bold of you to assume they have the expertise.
Bun folks routinely contribute to WebKit, and bun itself is an incredibly impressive project, so I don't think they're lacking expertise
I think they do. Building bun is a complex task, and engineers who can do that should also be able to figure out how to help out with a compiler. It is just a matter of immersing yourself in the code and being willing to put in the hours and hard work. Sure, they may not be able to help out with designing the type resolution, but there is other work which needs to be done that any skilled engineer can do.
> It would have been much better if the bun team joined forces and helped out
Submitting patches is joining forces and helping out.
Submitting patches that are correct and match the project's desired standards¹ is joining forces and helping out.
--------
[1] And align with the project's direction. This part is of course much more subjective so could very easily be an honest misunderstanding of the situation.
Not only because of the AI part; here's a discussion [0] about it
[0] https://ziggit.dev/t/bun-s-zig-fork-got-4x-faster-compilatio...
In the context of this post, it's absolutely hilarious that they're vibe-porting their Zig codebase to Rust.
I love Rust, but you couldn't pick a language with slower compile times... XD
Compiling Rust is actually quite fast in my experience. The problem with many Rust projects is that they pull in dependencies left, right, and center. Pulling in Tokio makes your project compile an entire thread management system even if you're just compiling Hello World, and simple one-liners containing macros can easily expand into dozens of lines of code each.
Linking is also slow, and the extreme amounts of metadata produced for LLVM almost serve as a benchmark for LLVM's throughput, but that's all in an effort to produce faster, better binaries in the end.
On godbolt.org, Hello World compiles and runs in about 250ms. Zig's Hello World compiles and runs in 600ms. Of course Zig is still an unfinished language so optimisations like these are probably hardly a priority, but when it comes to lines of code per second, the difference isn't as big as people make it out to be.
What will make the most difference is how many crates the rewrite will pull in. The PORTING.md file specifies "No `tokio`, `rayon`, `hyper`, `async-trait`, `futures`" for the second phase, which should definitely get rid of the excessive compile time many people associate with Rust projects.
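To make the macro point concrete, here's roughly what the one-line `#[tokio::main]` attribute turns into (a simplified sketch of the expansion, not Tokio's exact output):

```rust
// What you write:
//
//     #[tokio::main]
//     async fn main() { println!("hello"); }
//
// Roughly what the macro expands to (simplified; the real expansion
// also compiles in the I/O driver, timer, and work-stealing scheduler):
fn main() {
    tokio::runtime::Builder::new_multi_thread()
        .enable_all()
        .build()
        .expect("failed to build Tokio runtime")
        .block_on(async {
            println!("hello");
        });
}
```

Trimming features in Cargo.toml (e.g. `features = ["rt"]` instead of `"full"`) avoids compiling most of that machinery in the first place.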
8 replies →
That reply was educational indeed. Thanks for sharing.
> but were prevented from doing so because zig has a hard and fast "no AI code" rule
The patch would have been rejected either way because it was out of date and conflicted with other work going on.
Makes me wonder why zig announced the strict LLM rule recently. I'm afraid one reason could be that zig doesn't want to accept code from the bun fork in the first place (because of LLM usage, deviation and other reasons)
One non-obvious reason is that an important aspect of their community is to shepherd new contributors [1]. LLMs crushing everything would reduce that. More obvious is all the toil for maintainers dealing with LLM PRs (broadly, it's an issue). The Zig maintainers prefer to put their energy into improving people and fostering those relationships.
[1] https://kristoff.it/blog/contributor-poker-and-ai/
It's important that developers have an accurate mental model of how things work, are structured and why.
LLMs promote a decoupling of mental models and the actual codebase.
As much as some may want to believe otherwise, just reviewing what the LLM outputs is not equivalent to thinking through the implementation details and motivations, understanding exactly how and why things work the way they do, and then writing it yourself. The process itself is what instills that knowledge in you.
1 reply →
Well said! I don't think either party is really at fault here, but if Anthropic wanted to contribute non-negligible amounts of code over time then it's an absolute dealbreaker.
Sucks for people who were invested in contributing to Bun and don't like working with AI tools to be sure, but I think the writing was on the wall for them pretty much immediately post-acquisition. You must admit, it's hard to predict that 100% of source lines will be written by AI if you're not walking the walk!
That's a solid reason to keep LLMs away from the kind of tasks that help with onboarding. But a patch series from a competent team that changes 3000 lines should probably be evaluated on its own merits. Or at least, the collaboration-based reasons to reject AI don't apply and the real reason would be something else.
(Though I don't know if this particular patch series would get accepted on its own merits.)
5 replies →
Yeah, I remember when the lazy bastards started writing programs using compilers instead of learning assembly language. Now I don’t have a single colleague who can write assembly. There’s whole generations now who can’t code assembly. Most don’t even know what a register is. Hope Zig holds against this latest attempt to make everyone stupid.
31 replies →
There are other reasons why a project like Zig might not want to accept LLM generated contributions.
Zig, as a programming language, has a multiplier codebase. A bug may affect a significantly larger portion of users than bugs in most libraries or binaries will, as it's a fundamental building block of everything that uses Zig. Just that could be worth the extra scrutiny on every individual commit.
There's also the usual arguments: copyright ethics, environmental ethics and maintainer burden.
> has a multiplier codebase. A bug may affect a significant larger portion of users than most libraries or binaries will
Couldn't you say exactly the same about bun?
2 replies →
>Makes me wonder why zig announced the strict LLM rule recently.
I guess there are two philosophies in software development: move fast and break things, or move at a pace that guarantees everything is rock solid.
Most commercial software, Anthropic included, takes the former path, while most infrastructure teams take the latter.
I guess Linux and FreeBSD kernels are also not accepting LLM based contributions yet.
> I guess Linux and FreeBSD kernels are also not accepting LLM based contributions yet.
Both appear to be[1][2]. FreeBSD doesn't have a formal policy yet, but they appear to be leaning towards admitting some degree of LLM contribution.
[1]: https://docs.kernel.org/process/coding-assistants.html
[2]: https://forums.freebsd.org/threads/will-freebsd-adopt-a-no-a...
> I guess Linux and FreeBSD kernels are also not accepting LLM based contributions yet.
PostgreSQL, a famously slow-moving and rock-solid project, accepts LLM-based contributions. But they are held to the same high standard: if you cannot explain the patch you submitted, it will likely get rejected.
> move fast and break things and move at a pace that guarantees everything is rock solid.
Zig is famous for taking the former path! Anyone using Zig for a few years knows every release breaks things, and they are still making huge changes which I would classify as “moving fast”, like the recent IO changes!
1 reply →
The LLM rule has been a thing for a very long time at this point.
Possibly, but the Zig creator is active on Lobste.rs, where he's been vocally anti-LLM for a year now, so the timing could just be a coincidence.
It's a combination of pragmatism (not wanting to wade through slop, not wanting to shove out newbie developers) and politics (usual contemporary techie progressive stuff that's now oddly anti-technology).
> usual contemporary techie progressive stuff that's now oddly anti-technology
You can be against a particular technology without being "anti-technology".
See DRM/surveillance/bad self driving implementations.
> usual contemporary techie progressive stuff that's now oddly anti-technology
Just because a thing exists doesn't mean you have to use it for everything. You don't use an asbestos blanket? Why are you so against asbestos?
1 reply →
I like your username.
So if tomorrow Rust denied the "improvement" to upstream Rust then what's the next language they plan to vibe code it in?
Rust is a significantly more mature language. Adopting zig has to be done on the assumption that the language will significantly improve as your project evolves, and if those improvements don't agree with your project's goals, you're left in the lurch. Rust is basically finished, and adopting it has to be done on the assumption that it won't change very much. I don't know what their initial logic for adopting zig was, but I think porting to a more mature language was inevitable, unless by some miracle zig happened to rapidly mature in exactly the direction they wanted.
Perl
Raku?
C obviously.
I was hoping for bash, because why not. It's the AI that has to write and maintain it anyway, and Anthropic employees aren't subject to the 5-hour and 7-day usage limits anyway, I suppose.
You missed the part where everyone is going to run their own vibe-coded assembly tools [1].
So the next step will be that bun is directly rewritten from scratch at every iteration, and the repository will only contain the specs for the LLMs.
Caching the generated code locally will be authorized for some transition period, but as it's obviously very dangerous to let people tweak what exactly computers are doing, forbidding such a practice via a mandatory secure-boot mode is already planned. Only nazi pedophiles would do otherwise anyway, thus the enactment of the companion law is an obvious go-to.
[1] https://news.ycombinator.com/item?id=47997947
2 replies →
Javascript
Rust is legit one of the best languages to "vibe code" in.
The emitted code has a lower defect rate since the language incorporates strong types and built-in error handling. Other pros include native code and portability; the downside is the compile time.
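A tiny illustration of those guardrails (my own sketch, not anything from the bun codebase): the compiler forces generated code to acknowledge every failure path and every case.

```rust
use std::num::ParseIntError;

// Fallibility is in the type: callers cannot pretend parsing can't fail.
fn parse_port(s: &str) -> Result<u16, ParseIntError> {
    s.trim().parse::<u16>()
}

fn main() {
    // A match over a Result must be exhaustive; dropping either arm
    // is a compile error, so an LLM can't silently skip the error path.
    match parse_port("8080") {
        Ok(port) => println!("listening on port {port}"),
        Err(e) => eprintln!("invalid port: {e}"),
    }
}
```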
This could be a subjective feeling with no real data to back it up.
People say the same about Go, that its type system and limited feature set make it the most AI-friendly language, but there too it seems like a hunch rather than a proven fact.
7 replies →
Downside: CC and Codex will write, compile, and fix in a loop until they have a monstrosity, rather than designing something smarter.
Excellent comment.
As for that downside, the compile time is somewhat offset once you're using agents (and especially parallel agents) anyway. Since every edit costs a round-trip API call to a third-party server, you can accept a slightly slower compile step.
> but were prevented from doing so because zig has a hard and fast "no AI code" rule
No, they were prevented from doing so because the Zig devs didn't like the proposed changes and are preparing a more comprehensive improvement.
Even if AI had not been used, the changes would not have been upstreamed; see https://ziggit.dev/t/bun-s-zig-fork-got-4x-faster-compilatio... tl;dr: the supposed improvements are not sound, and the zig compiler has already gotten a whole lot faster
This should be the top comment in the whole thread. AI is not the point; the PR is just not of good quality.
What a sober, detailed forum post.
Thanks, that is the answer.
That is a devastating comment. I will now be extremely skeptical of bun.
The Zig maintainers did a pretty in-depth review of the PR, and laid out multiple technical reasons for why it would not get merged. They did not reject it simply for being vibe-coded (though that is likely the cause of it sucking).
Anthropic just needs to buy Zig! Problem solved.
Perfect A/B experiment opportunity. Fork Zig, call the fork Zag.
Lock the syntax/api together for a couple of years. Allow AI code in Zag.
Review after a few years, see which is better.
Interesting experiment. Would it actually function if Zag were syntax/API-locked to Zig? I guess Zag could still have API extensions.
last zig fork didn't go so well: https://ziglang.org/news/statement-regarding-zen-programming...
Take off every Zig
It's time!
https://xkcd.com/286/
3 replies →
You know what you doing!
...and rewrite it in rs.
>They recently tried to upstream an improvement to zig, but were prevented from doing so because zig has a hard and fast "no AI code" rule.
And will Rust team accept their vibe coded patches?
It depends: https://github.com/rust-lang/rust/pull/155403#issuecomment-4...
Very likely not, if they are of similarly low quality.
No. The Rust project developers are more lenient when it comes to developing patches with AI assistance, but the amount of leniency one receives is proportional to the amount of pre-existing trust a contributor has with the project, and every PR still has to be reviewed by an independent human. A stranger dumping a zillion lines of slop in a PR is a one-way ticket to having your PR politely closed.
Yeah, now that I think about it, having a major project written in a language that doesn't accept AI contributions now owned by a major AI company was a recipe for dis... er, conflict.
I'm not a huge fan of Rust, but I guess having a project like Bun in an actually memory safe language is probably a win? Guess it depends on how good Claude is at writing Rust code...
I see that as a win for Zig.
Read the previous discussions on the topic. Your summary is a sensationalist lie, since their change was apparently a smoking pile of hot garbage, and Zig already had similar performance gains in a newer release.
> They recently tried to upstream an improvement to zig
They didn't.
Not only that, but Zig was already working on a similar improvement to their change.
seems easier to fork zig
Then that becomes an ongoing effort. The rewrite is once. (Good idea or not)
good, more reason to stay away from zig
Stay away. Everyone wins.
Probably more about going with a native language that is reliable and battle-tested. Rust runs in Firefox and in production systems across major orgs, so this is not surprising.