Comment by nindalf

10 months ago

Marcan links to an email by Ted Ts'o (https://lore.kernel.org/lkml/20250208204416.GL1130956@mit.ed...) that is worth reading. Although it starts on a polarising note ("thin blue line"), it does a good job of explaining the difficulties that Linux maintainers face and why they make the choices they do.

It makes sense to be extremely adversarial about accepting code because they're on the hook for maintaining it after that. They have maximum leverage at review time, and 0 leverage after. It also makes sense to relax that attitude for someone in the old boys' network because you know they'll help maintain it in the future. So far so good. A really good look into his perspective.

And then he can't help himself. After being so reasonable, he throws shade at Rust. Shade that is, unfortunately, just false.

- "an upstream language community which refuses to make any kind of backwards compatibility guarantees" -> Rust has had a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.

- "which is actively hostile to a second Rust compiler implementation" - except that isn't true? Here's the maintainer on the gccrs project (a second Rust compiler implementation), posting on the official Rust Blog -> "The amount of help we have received from Rust folks is great, and we think gccrs can be an interesting project for a wide range of users." (https://blog.rust-lang.org/2024/11/07/gccrs-an-alternative-c...)

This is par for the course, I guess, and it's what exhausts folks like marcan. I wouldn't want to work with someone like Ted Ts'o, who clearly has a penchant for flame wars and isn't interested in being truthful.

> And then he can't help himself. After being so reasonable, he throws shade at Rust. Shade that is, unfortunately, just false.

Many discussions online (and offline) suffer from a huge group of people who just can't stop themselves from making their knee-jerk reactions public, and then never thinking about it any further.

I remember the "Filesystem in Rust" video (https://www.youtube.com/watch?v=WiPp9YEBV0Q&t=1529s) where there are people who misunderstand what the "plan" is, and argue against being forced to use Rust in the Kernel, while the speaker is literally standing in front of them and saying "no one will be forced to use Rust in the Kernel".

You can literally shove facts in someone's face, and they won't admit to being wrong or having misunderstood, and will instead continue to argue against points whose premise isn't even true.

I personally don't know how to deal with this either, and tend to just leave/stop responding when it becomes clear people aren't looking to collaborate/learn together, but instead just wanna prove their point somehow and that's the most important part for them.

  • If you watch that YouTube link, you'll see the same guy, Ted Ts'o, accusing the speaker of wanting to convert people to the "religion promulgated by Rust". I think he apologised for that comment, but this email shows he hasn't changed his behaviour in the slightest.

    • His email seems very reasonable to me (the thin-blue-line comment is a bit weird, though). To me the problem is that some Rust people seem to expect that the Linux maintainers (who put in a tremendous amount of work) just have to go out of their way to help them achieve their goals, even if the maintainers are not themselves convinced about it and later have to carry the burden.

      41 replies →

    • > Ted Ts'o accusing the speaker of wanting to convert people to the "religion promulgated by Rust"

      Given the online temper tantrum thrown by marcan, Ted Ts'o's comment seems totally reasonable, regardless of one's opinion of Rust in the Linux kernel.

      11 replies →

  • We have used that video as an exercise in how not to achieve change. Assuming everyone is acting in good faith: the presenter missed the opportunity to build consensus before the talk, Ts'o was unwilling to budge, and most of all the moderator was unable to prevent the situation from exploding. This could have been handled much better by each of them.

    In contrast to the parent: yes, the presenter says "you don't have to use Rust, we are not forcing you", but he fails to address the concern that a change they introduce would cause errors downstream and someone else would have to clean up afterwards.

    • > In contrast to the parent: yes, the presenter says "you don't have to use Rust, we are not forcing you", but he fails to address the concern that a change they introduce would cause errors downstream and someone else would have to clean up afterwards.

      He did not fail to address that concern. And then Ted shouted him down for 2 minutes such that he couldn't get 2 syllables in to respond.

    • > We have used that video as an exercise in how not to achieve change

      I'm not disagreeing with anything you said; just curious who the "we" you're referring to here is. Are you a kernel developer or something similar?

    • > Assuming everyone is acting in good faith

      Why would we take Ted repeatedly using strawman fallacies, bleating appeals to emotion, and acting like a victim, all the while shouting people down, as evidence of "acting in good faith"?

      When you shout over someone like that you're nothing but a bully.

      > he fails to address the concern that a change they introduce would cause errors downstream and someone else would have to clean up afterwards.

      Because that "concern" was a strawman. It demonstrated that Ted either did not understand what the presenters were asking for, or simply didn't like others asking him to do something, because he's very important and nobody tells him what to do.

      As has been exhaustively explained by others in previous HN threads and elsewhere: the Rust developers were asking to be informed of changes so that Rust developers could update their code to accommodate the change.

      Ted loses his shit and starts shouting nonsense about others forcing people to learn Rust, and so on.

      > but most of all the moderator unable to prevent the situation from exploding

      When someone is being abusive to others, the issue is never "the people on the receiving end are not handling it as best they can."

      Further: did it occur to you that Ted's infamous short temper, and his "status" as a senior kernel developer, might be why the moderator was hesitating to respond?

      Imagine how Ted would have reacted if he had been told to speak respectfully, lower his voice, and stop talking over others. And imagine how the army of nerds who think Ted's behavior was acceptable or understandable would have reacted to that.

      6 replies →

  • > "no one will be forced to use Rust in the Kernel"

    Is this true, though? One reason for this altercation seems to be the basic circumstance that in Linux kernel development, if there is a dependency between two pieces of code A and B, the responsibility to keep B consistent with changes to A lies, in order, with anyone proposing patches to A, the subsystem maintainer for A, and finally the subsystem maintainer for B. If B is Rust code, such as a binding, then that's potentially up to 3 people who don't want to use Rust being forced to use Rust.

    • They're not "forced to use Rust". They are maybe forced to work with Rust developers of whichever subsystem needs to be updated, but that would always have been the case with the C developers of whichever subsystem needs to be updated too.

      4 replies →

    • It's absolutely not true, it's one of the lies being told by Rust 4 Linux people. The end goal is absolutely to replace every last line of C code with Rust, and that's what they will openly tell you if you speak to them behind closed doors. That's why there is always an implicit threat directed at the C maintainers about job loss or "being on the right side of history". The Rust 4 Linux people are absolutely attempting a hostile takeover and nobody should believe a word that comes out of their mouths in public mailing lists when they are contradicting it so consistently behind closed doors.

  • > You can literally shove facts in someone's face, and they won't admit to being wrong or having misunderstood, and will instead continue to argue against points whose premise isn't even true.

    This is like probably 80% of people and fundamentally why the world is a hellscape instead of a utopia.

  • The speaker doesn't understand the audience question and doesn't respond to it.

    The audience member points out that they shouldn't encode the semantics into the Rust type system because that would mean that refactoring the C code breaks Rust, which is not an acceptable situation. The speaker responds to this by saying essentially "tell me what the semantics are and I'll encode them in the Rust type system." That's maximally missing the point.

    The proposal would cause large classes of changes to C to break the build, which would dramatically slow down kernel development, even if a small handful of Rust volunteers agree to eventually come in and fix the build.

    > You can literally shove facts in someone's face, and they won't admit to being wrong or having misunderstood, and will instead continue to argue against points whose premise isn't even true.

    I have to say that I used to be excited about Rust, but the Rust community seems very toxic to me. I see a lot of anger, aggression, vindictiveness, public drama, etc. On HN you not infrequently see downvoting used to indicate disagreement. These clashes with the Linux maintainers look really bad for Rust to me. So bad that I'm pretty convinced Rust as a language is over if they're no longer banging on the technical merits and are instead banging on the table.

    I'm sure there are great things about the community. But I would encourage the community to have higher standards of behavior if they want to be taken seriously. The Linux team seem like they're trying to look beyond the childishness because they are optimistic about the technical merits, but they must be so tired of the drama.

    • > I have to say that I used to be excited about Rust, but the Rust community seems very toxic to me. I see a lot of anger, aggression, vindictiveness, public drama, etc.

      I had the same impression.

      Why is all this drama 90% of the time around Rust people?

      8 replies →

    • > The audience member points out that they shouldn't encode the semantics into the Rust type system because that would mean that refactoring the C code breaks Rust, which is not an acceptable situation. The speaker responds to this by saying essentially "tell me what the semantics are and I'll encode them in the Rust type system." That's maximally missing the point.

      You have to encode your API semantics somewhere.

      Either you encode them at the type system and find out when it compiles, or you encode it at runtime, and find out when it crashes (or worse, fails silently).
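The trade-off can be sketched with a toy typestate example (hypothetical names, assuming nothing about the actual kernel bindings): encoding "must be opened before reading" in the type system makes misuse fail to compile, while a runtime flag only surfaces the same mistake when the code runs.

```rust
use std::marker::PhantomData;

// States for a hypothetical resource that must be opened before reading.
struct Closed;
struct Open;

// 1. Semantics in the type system: `read` simply does not exist
//    on a closed handle, so misuse fails to compile.
struct Handle<State> {
    _state: PhantomData<State>,
}

impl Handle<Closed> {
    fn new() -> Handle<Closed> {
        Handle { _state: PhantomData }
    }
    fn open(self) -> Handle<Open> {
        Handle { _state: PhantomData }
    }
}

impl Handle<Open> {
    fn read(&self) -> &'static str {
        "data"
    }
}

// 2. Semantics at runtime: every call checks a flag and can fail late.
struct RuntimeHandle {
    open: bool,
}

impl RuntimeHandle {
    fn read(&self) -> Result<&'static str, &'static str> {
        if self.open { Ok("data") } else { Err("not open") }
    }
}

fn main() {
    let h = Handle::new().open();
    assert_eq!(h.read(), "data");
    // `Handle::new().read()` would not compile: no such method on Handle<Closed>.

    let rh = RuntimeHandle { open: false };
    assert!(rh.read().is_err()); // the same mistake only surfaces at runtime
}
```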

    • I disagree; they didn't point this out directly, because it's nonsense. Semantic changes can break anything, even if it's some intermediary API.

      There is more breakage in Rust due to the type-system-related semantics, but ideally a C dev would also want their system to break if the semantics aren't right. So is this a criticism of C..?

      So following this argument, they don't want Rust because C falls short? Nonsense.

      edit: The speaker did mention that they didn't want to force limited use on the base APIs, but that for a great deal of their usage they could have determined fixed semantics and made intermediary APIs for them. So this was not about limiting the basic APIs.

      1 reply →

  • > You can literally shove facts in someone's face, and they won't admit to being wrong or having misunderstood, and will instead continue to argue against points whose premise isn't even true.

    I think that's part of the gag.

    "These people are members of a community who care about where they live... So what I hear is people caring very loudly at me." -- Leslie Knope

    https://www.youtube.com/watch?v=areUGfOHkMA

    • >"These people are members of a community who care about where they live... So what I hear is people caring very loudly at me." -- Leslie Knope

      That's a very healthy and, I feel, correct attitude towards this kind of criticism. I love it when wisdom comes from stupid places.

      12 replies →

  • > You can literally shove facts in someone's face, and they won't admit to being wrong or having misunderstood, and will instead continue to argue against points whose premise isn't even true.

    It's called a strawman fallacy, and like all fallacies, it's used because the user is either intellectually lazy and can't be bothered to come up with a proper argument, or there isn't a proper argument and the person they're using it against is right.

    • If an honest alien says "we don't want to convert humans to our religion" that means you can have whatever religion you want. If a dishonest alien says it, it might mean "we don't want to convert humans because we are going to kill all humans", it's selectively true - they aren't going to convert us - and leaves us to imagine that we can have our own religion. But it's not the whole truth and we actually won't be able to[1].

      An honest "no one will be forced to use Rust in the Kernel" would be exactly what it says. A paltering reading could be "we want to make Rust the only language used in the Kernel but you won't be forced to use it because you can quit". i.e. if you are "literally shoving facts in someone's face" and they don't change then they might think you are not telling the whole truth, or are simply lying about your goals.

      [1] https://en.wikipedia.org/wiki/Paltering

      1 reply →

> Rust has had a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.

Unfortunately OP has a valid point regarding Rust's lack of commitment to backwards compatibility. Rust has a number of things that can break you that are not considered breaking changes. For example, implementing a trait (like Drop) on a type is a breaking change[1] that Rust does not consider to be breaking.

[1]: https://users.rust-lang.org/t/til-removing-an-explicit-drop-...
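A minimal sketch of why adding Drop can break downstream code (toy types, not the exact case in the linked thread): a type without a Drop impl lets callers partially move fields out of it, and adding Drop later turns those moves into compile errors.

```rust
// A library type without a Drop impl.
struct Token {
    secret: String,
}

// Downstream code may rely on partially moving a field out of the struct.
fn into_secret(t: Token) -> String {
    t.secret // fine today: Token has no Drop impl
}

// If the library later adds:
//
//     impl Drop for Token {
//         fn drop(&mut self) { /* e.g. zero the secret */ }
//     }
//
// then the partial move in `into_secret` becomes a compile error (E0509:
// "cannot move out of a type which implements the Drop trait"), even though
// no function signature changed.

fn main() {
    let t = Token { secret: String::from("hunter2") };
    assert_eq!(into_secret(t), "hunter2");
}
```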

  • I think we're mixing 2 things here: language backward-compatibility, vs. standard practices about what semver means for Rust libraries. The former is way stronger than the latter.

    • > language backward-compatibility, vs. standard practices about what semver means

      I've read and re-read this several times now and for the life of me I can't understand the hair you're trying to split here. The only reason to do semantic versioning is compatibility...

      8 replies →

  • I was hit by a similar thing. Rust once caused regression failures in 5000+ packages due to an incompatibility with older versions of the "time" crate [1]. It was considered okay. At that point, I don't care what they say about semver.

    [1]: https://github.com/rust-lang/rust/issues/127343#issuecomment...

    • The comment you linked to explicitly shows that a maintainer does not consider this "okay" at all. T-libs-api made a mistake, the community got enraged, T-libs-api hasn't made such a mistake since. The fact that it happened sucks, but you can't argue that they didn't admit the failure.

      8 replies →

    • That was a mistake and a breakdown in processes that wasn't identified early enough to mitigate the problem. That situation does not represent the self-imposed expectations on acceptable breakage, just that we failed to live up to them; by the time it became clear that the change was problematic, it was too late to revert course, because that would itself have been a breaking change.

      Yes: adding a trait to an existing type can cause inference failures. The Into trait fallback, where calling a.into() gives you back a, is particularly prone to it, and I've been working on a lint for it.

      9 replies →

    • > At that point, I don't care what they say about semver.

      Semver, or any compatibility scheme, really, is going to have to obey this:

      > it is important that this API be clear and precise

      —SemVer

      Any detectable change being considered breaking is just Hyrum's Law.

      (I don't want to speak to this particular instance. It may well be that "I don't feel that this is adequately documented or well-known that Drop isn't considered part of the API" is valid, or arguments that it should be, etc.)

  • Implementing (or removing) Drop on a type is a breaking change for that type's users, not the language as a whole. And only if you actually write a trait that depends on types directly implementing Drop[0].

    Linux breaks internal compatibility far more often than people add or remove Drop implementations from types. There is no stability guarantee for anything other than user-mode ABI.

    [0] AFAIK there is code that actually does this, but it's stuff like gc_arena using this in its derive macro to forbid you from putting Drop directly on garbage-collectable types.

    • > is a breaking change _for that type's users_, not the language as a whole.

      And yet the operating mantra...the single policy that trumps all others in Linux kernel development...

      is don't break user space.

      1 reply →

    • > Linux breaks internal compatibility far more often than people add or remove Drop implementations from types. There is no stability guarantee for anything other than user-mode ABI.

      I think that's missing the point of the context though. When Linux breaks internal compatibility, that is something the maintainers have control over and can choose not to do. When it happens to the underlying infrastructure the kernel depends on, they don't have a choice in the matter.

  • Removing impl Drop is like removing a function from your C API (or removing any other trait impl): something library authors have to worry about to maintain backwards compatibility with older versions of their library. A surprising amount of Rust's complexity exists specifically because the language developers take this concern seriously, and try to make things easier for library devs. For example, people complain a lot about the orphan rules but they ensure adding a trait impl is never a breaking change.

  • The meaning of this code has not changed since Rust 1.0. It wasn't a language change, nor even anything in the standard library. It's just a hack that the poster wanted to work, and then realized doesn't work (it never worked).

    This is the equivalent of a C user saying "I'm disappointed that replacing a function with a macro is a breaking change".

    Rust has had actual changes that broke people's code. For example, any ambiguity in type inference is deliberately an error, because Rust doesn't want to silently change the meaning of users' code. At the same time, Rust doesn't promise it will never create a type inference ambiguity, because that would make almost any change to traits in the standard library impossible. It's a problem that happens rarely in practice, can be reliably detected, and is easy to fix when it happens, so Rust chose to exclude it from the stability promise. They've usually handled it well, except that recently they miscalculated ("only one package needed to change code, and they've already released a fix") but forgot to give users enough time to update the package first.

  • I'm curious now. What are the backwards compatibility guarantees for C?

    • As long as you compile with the version specified (e.g., `-std=c11`) I think backwards compatibility should be 100%. I've been able to compile codebases that are decades old with modern compilers with this.

      16 replies →

    • For the purposes of the Linux kernel, there's essentially a custom superset of C that is defined as "right" for the kernel, and there are maintainers responsible for maintaining it.

      While GCC with a few basic flags will, in general, produce a binary that cooperates with the kernel, kbuild loads all those flags for a reason.

      2 replies →

    • The backwards compatibility guarantee for C is "C99 compilers can compile C99 code". If they can't, that's a compiler bug. Same for other C standards.

      Since Rust doesn't have a standard, the guarantee is "whatever the current version of the compiler can compile". To check if they broke anything they compile everything on crates.io (called a crater run).

      But if you check the results of crater runs, almost every release some crates that compiled in the previous version stop compiling in the new one. But as long as the number of such breakages is not too large, they say "nothing is broken" and push the release.

      10 replies →

> "which is actively hostile to a second Rust compiler implementation" - except that isn't true?

Historically the Rust community has been extremely hostile towards gccrs. Many have claimed that the work would be detrimental to Rust as a language since it would split the language in two (despite gccrs constantly claiming they're not trying to do that). I'm not sure if it was an opinion shared by the core team, but if you just browse Reddit and Twitter you would immediately see a bunch of people being outright hostile towards gccrs. I was very happy to see that blog post where the Rust leadership stepped up to endorse it properly.

Just one reference: In one of the monthly updates that got posted on Reddit (https://old.reddit.com/r/rust/comments/1g1343h/an_update_on_...) a moderator had to write this:

> Hi folks, because threads on gccrs have gotten detailed in the past, a reminder to please adhere to the subreddit rules by keeping criticism constructive and keeping things in perspective.

  • The LKML quote is alleging that the upstream language developers (as opposed to random users on Reddit) are opposed to the idea of multiple implementations, which is plainly false, as evidenced by the link to the official blog post celebrating gccrs. Ted Ts'o is speaking from ignorance here.

    • I think it’s more pointed towards people like me who do think that gccrs is harmful (I’m not a Rust compiler/language dev - just a random user of the language). I think multiple compiler backends are fine (eg huge fan of rustc_codegen_gcc) but having multiple frontends I think can only hurt the ecosystem looking at how C/C++ have played out vs other languages like Swift, Typescript etc that have retained a single frontend. In the face of rustc_codegen_gcc, I simply see no substantial value add of gccrs to the Rust ecosystem but I see a huge amount of risk in the long term.

    • (emphasis mine)

      > opposed to the idea of multiple implementations, which is plainly false, as evidenced by the link to the official blog post celebrating gccrs. Ted Ts'o is speaking from ignorance here.

      Why use such strong words? Yes, there's clearly a misunderstanding here, but why do we need to use equally negative words towards them? Isn't it more interesting to discuss why they have this impression? Maybe the communication from the upstream language developers hasn't been clear enough? It's a blog post that is only a few months old, so if that's the only signal, it's maybe not so strange that they've missed it.

      Or maybe they are just actively lying because they have their own agenda. But I don't see how this kind of communication, assuming the worst of the other party, brings us any closer.

      1 reply →

  • For whatever it's worth, I did believe that some of the Rust team was very hostile towards gccrs, but that behavior has completely changed, and it seems like they're receiving a lot of support these days.

    Reddit... is reddit.

  • > > > Hi folks, because threads on gccrs have gotten detailed in the past

    Here's guessing they meant "derailed".

These are often the same type of individual, whose viewpoint tends to shift towards "change is progressing rapidly in an area that I don't understand, and this scares me." Anytime an expert in a particular area has their expertise challenged or even threatened by a new technology, it is perfectly human to react defensively towards the perceived threat. Part of growth as a human is recognizing our biases and attempting to mitigate them, hopefully extending this into other areas of our lives as well. After all, NIMBYs probably started out with reasonable justifications for why they want to keep their communities the way they currently are: it's comfortable and it works, and they're a significant contributor to the community. Any external "threat" to this becomes elevated to a moral crusade against the invaders encroaching upon their land, when really they're tilting at windmills.

  • Or "change is progressing rapidly in an area I have been working in for 20 years, and I have seen this kind of thing fail before"

    • I think confronting the volunteers who maintain open-source software with arguments such as "you just don't want to learn new things", "you're scared of change", etc. is very unfair. IMHO new ideas should prove themselves, not be pushed through because Google wants it or certain enthusiastic groups believe this is the future. If Rust is so much better, people should just build cool stuff with it, and then it will be successful anyway.

      8 replies →

    • Fair; I acknowledge I may have misrepresented this group who are against the Rust community as not being experts in this space; they certainly are. Rust doesn't have to be the answer, but if we treat others (namely Rust supporters) and their solutions as dead on arrival because they're implemented in a technology we're not entirely familiar with, how can we get to a point where we're solving difficult problems? Especially if we create an unwelcoming space for contribution?

  • > After all, NIMBYs probably started out with reasonable justifications for why they want to keep their communities the way they currently are

    Bad example IMO. What is reasonable about this? http://radicalcartography.net/bayarea.html

    • It may be a poor example; it's what came to mind initially. I don't think the end results are at all the same, but I think the initial emotions around why you might balk at something new entering your community have parallels to the topic at hand.

> Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.

This is a very persistent myth, but it’s wrong. Adding any public method to any impl can break BC (because its name might conflict with a user-defined method in a trait), and the Rust project adds methods to standard library impls all the time.

  • This is true, strictly speaking, but it rarely causes problems. Inherent methods are prioritized over trait methods, so this only causes problems if two traits define a method with the same name and the method is invoked in an ambiguous context.

    This is a rare situation, and std strives to prevent it. For example, in [1], a certain trait method was called extend_one instead of push for this reason. Crater runs are also used to make sure the breakage is as rare as T-libs-api expected. The Linux kernel in particular only uses core and not std, which makes this even more unlikely.

    [1]: https://github.com/rust-lang/rust/issues/72631
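The resolution order described above can be checked directly (toy names, nothing here is from std):

```rust
struct Widget;

impl Widget {
    // Inherent method.
    fn describe(&self) -> &'static str {
        "inherent"
    }
}

trait Describe {
    // Trait method with the same name, using a default body.
    fn describe(&self) -> &'static str {
        "trait"
    }
}

impl Describe for Widget {}

fn main() {
    let w = Widget;
    // Plain method syntax picks the inherent method, so adding an inherent
    // method whose name an in-scope trait already uses changes which
    // function runs rather than causing a compile error.
    assert_eq!(w.describe(), "inherent");
    // The trait method remains reachable via fully qualified syntax.
    assert_eq!(Describe::describe(&w), "trait");
}
```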

    • Okay, but “they try to avoid issues” is not the same as “they guarantee never to intentionally break BC except to fix compiler bugs”.

  • That's just not true. If the user has defined the method on the struct and a trait also has a method with that name, the struct's impl is used. Multiple traits can have methods with the same name, too.

    https://play.rust-lang.org/?version=stable&mode=debug&editio...

    • My comment might have been technically wrong as originally stated; I’ve since edited to try to correct/clarify.

      What I really meant is the case where a method is added to a standard struct impl that conflicts with a user-defined trait.

      For example, you might have implemented some trait OptionExt on Option with a method called foo. If now a method called foo is added to the standard option struct, it will conflict.
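A minimal sketch of that scenario, using the commenter's hypothetical OptionExt/foo names: today the call resolves to the extension trait, because Option has no inherent method of that name.

```rust
// Hypothetical extension trait, as described in the comment above.
trait OptionExt {
    fn foo(&self) -> bool;
}

impl<T> OptionExt for Option<T> {
    fn foo(&self) -> bool {
        self.is_some()
    }
}

fn main() {
    let x = Some(1);
    // Today this resolves to OptionExt::foo, since Option has no inherent
    // method of that name. If a future std release added an inherent
    // Option::foo, this call would silently resolve to the inherent method
    // instead (or stop compiling if the signatures differ).
    assert!(x.foo());
    assert!(!None::<i32>.foo());
}
```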

      3 replies →

> which is actively hostile to a second Rust compiler implementation

Which is hilarious since Linux itself was actively hostile to the idea of a second C compiler supporting it. Just getting Linux to support Clang instead of only GCC was a monumental task that almost certainly only happened because Android forced it to happen.

  • It happened because the Android people put in the work to make it happen both in Linux and in Clang/LLVM.

    • Putting in the work is one thing, and it's what the Rust-in-Linux people are also doing, but there's also the political requirement of getting maintainers to accept it. Android was big enough, and happy enough to fork (seeing as it had already done that before), that it forced a lot of hands with top-down mandates.

      Rust, despite having Linus' blessing to be in the kernel, is still getting rejected just because it's Rust, completely unrelated to any technical merits of the code itself.

Thank you for sharing the Ted Ts'o LKML post. Can you explain the cultural reference "thin blue line"? I had never heard it before.

  • It's a motto used by American law enforcement to justify extrajudicial punishment. Since they are the "thin blue line" that separates the public from anarchy, they are justified in acting independently to "protect" us when judges and juries do not "cooperate".

    • Not just extrajudicial punishment, but also overlooking corrupt acts and crimes by fellow officers: the idea that it's more important to maintain the 'brotherhood' than to arrest an officer caught driving home drunk.

    • No, that's not really true.

      Directly, "the thin blue line" expresses the idea that the police are what separates society from chaos.

      It doesn't inherently suggest police are justified in acting outside the law themselves, though, of course, various people have suggested this (interestingly, from both a pro-police and anti-police perspective).

      It seems obvious to me that the post was using this phrase in the sense of being a thin shield from chaos.

    • That is a very strange take. The phrase isn't American and has no negative connotation. It has nothing to do with "extrajudicial punishment". It simply refers to the (obvious) fact that what separates societies from anarchy is the "thin blue line" of law enforcement.

      Rowan Atkinson had a sitcom set in a London police station in the 90s called "The Thin Blue Line". Are you under the impression he was dogwhistling about extrajudicial violence?

      4 replies →

  • In the US, this is a reference to the belief that members of law enforcement should be loyal first to other members of law enforcement and only secondarily to the law. Or at least that is how I have always understood it.

    • It seems obvious that that’s not what Ted intended it to mean, since it wouldn’t even make sense in this context (the debate doesn’t really seem to be about whether maintainers should be loyal to other maintainers).

      A more charitable interpretation would be “we’re the only line of defense protecting something good and valuable from the outside world, so people should give significant weight to our opinions and decisions”. Which, to be clear, I would still mostly disagree with WRT the police, but it at least doesn’t explicitly endorse corruption.

      9 replies →

> - "an upstream language community which refuses to make any kind of backwards compatibility guarantees" -> Rust has had a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.

The most charitable interpretation I can imagine is that the Rust-in-Linux project needs specific nightly features, and those don't get stability guarantees. But I think this is still pretty unfair to complain about; my impression is there's a lot of appetite on the Rust side to get those stabilized.

I also think...

> we know, through very bitter experience, that 95+% of the time, once the code is accepted, the engineers which contribute the code will disappear, never to be seen again.

...that while there's truth in this, there's also a large extent to which it's a self-fulfilling prophecy. Someone might want to stick it out to get their work into mainline once, but then look back at the process in the rear-view mirror and say never again.

...and:

> Instead of complaining about maintainers for who are unreasonably caring about these things, when they are desparately under-resourced to do as good of a job as they industry demands, how about meeting us half-way and helping us with these sort of long-term code health issues?

It's really hard for me to not see "let's actually write down the contract for these functions, ideally via the type system" as doing exactly that. Which seems to me to be the central idea Ted Ts'o was ranting about in that infamous video.

If comments as benign as "thin blue line" cause fragile entryists/activists to flee, I say Ted and the kernel team are doing the right thing. Projects as critical as the Linux kernel shouldn't be battlegrounds for the grievance of the week, nor should they be platforms for proselytizing. Marcan and others like him leave long paths of destruction in their wake. Lots of projects have been turned upside down by the drama they seem to bring with them everywhere. The salient point is contributors need to be more than "drive by" submitters for their pet projects. This isn't specific to Rust in the kernel; look at how much of an uphill battle bcachefs was/is.

  • I didn't even know what the whole issue with the "thin blue line" comment was until I read this thread. I was never under the impression "thin blue line" was about corruption or brutality, I think people are conflating "thin blue line" with "blue lives matter", which is an entirely different subject.

  • Quite wild to see this downvoted: by downvoting, one surely implies the inverse of your post, i.e. that projects such as the Linux kernel should be battlegrounds for the grievance of the week, should be platforms for proselytizing, and so forth.

    Very strange to see little to no empathy for kernel maintainers in this situation.

> Rust has a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.

The community is more than just the language and compiler vendor(s). It's everyone using the language, with particular emphasis on the developers of essential libraries and tools that those users use and on which they're reliant.

In this sense, based on every time I've attempted to use Rust (even after 1.0), Ts'o's remark ain't inaccurate from what I can tell. If I had a nickel for every Rust library I've seen that claims to only support Rust Nightly, I'd have... well, a lot of nickels. Same with Rust libraries not caring much about backward-compatibility; like yeah, I get it during pre-1.0, or while hardly anyone's using it, but at some point people are using it and you are signaling that your library's "released", and compatibility-breaking changes after that point make things painful for downstream users.

> Here's the maintainer on the gccrs project (a second Rust compiler implementation), posting on the official Rust Blog

Same deal here. The Rust developers might be welcoming of additional implementations, but the broader community might not be. I don't have enough information to assess whether the Rust community is "actively hostile" to a GCC-based Rust implementation, but from what I can tell there's little enthusiasm about it; the mainstream assumption seems to be that "Rust" and its LLVM-based reference compiler are one and the same. Maybe (hopefully) that'll change.

----

The bigger irony here, in any case, is that the Linux community has both of these very same problems:

- While the kernel itself has strict backwards-compatibility guarantees for applications, the libraries those applications use (including absolutely critical ones like glibc) very much do not. The ha-ha-only-serious observation in the Linux gaming community is that - thanks to Wine/Proton - the Windows API is the most stable ABI for Linux applications. Yeah, a lot of these issues are addressable with containerization, or by static compilation, but it's annoying that either are necessary for Linux-native applications to work on old and new distros alike.

- As marcan alludes to in the article, the Linux community is at least antipathetic (if not "actively hostile") to Linux-compatible kernels that are not Linux, be they forks of Linux (like Android) or independent projects that support running Linux applications (WSL 1/2, FreeBSD, some illumos distros, etc.). The expectation is that things be upstreamed into "the" Linux, and the norms around Linux development make out-of-tree modules less-than-practical. This is of course for good reason (namely: to encourage developers to contribute back to upstream Linux instead of working in silos), but it has its downsides - as marcan experienced firsthand.

It's also not truthful because many of the Rust maintainers are long-time C contributors.

Marcan also linked to this resignation of a Rust Maintainer:

https://lore.kernel.org/lkml/20240828211117.9422-1-wedsonaf@...

which references this fantastic exchange:

https://www.youtube.com/watch?v=WiPp9YEBV0Q&t=1529s

I am not a C person, or a kernel level person, I just watch this from the sideline to learn something every now and then (and for the drama). But this exchange is really stunning to me. It seems so blatantly obvious to me that systematically documenting (in code!) and automatically checking semantic information that is required to correctly use an API is a massive win. But I have encountered this type of resistance (by very smart developers building large systems) in my own much smaller and more trivial context. To some degree, the approach seems to be: "If I never write down what I mean precisely, I won't have to explain why I changed things." A more charitable reading of the resistance is: Adding a new place where the semantics are written down (code, documentation and now type system) gives one more way in which they can be out of sync or subtly inconsistent or overly restrictive.

But yeah, my intuitive reaction to the snippet above is just incredulity at the extreme resistance to precisely encoding your assumptions.
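To make the "precisely encoding your assumptions" point concrete, here is a toy Rust sketch (nothing to do with the actual kernel bindings): the invariant "this list has at least one element" is checked once at construction, and every later use is statically guaranteed.

```rust
// Toy example: the invariant "at least one element" lives in the type
// itself, not in a doc comment that can drift out of sync.
struct NonEmpty<T> {
    head: T,
    tail: Vec<T>,
}

impl<T> NonEmpty<T> {
    // Constructing a NonEmpty proves the invariant exactly once.
    fn new(head: T, tail: Vec<T>) -> Self {
        NonEmpty { head, tail }
    }

    // No Option, no runtime check: an empty NonEmpty is unrepresentable.
    fn first(&self) -> &T {
        &self.head
    }

    fn len(&self) -> usize {
        1 + self.tail.len()
    }
}

fn main() {
    let xs = NonEmpty::new(10, vec![20, 30]);
    assert_eq!(*xs.first(), 10);
    assert_eq!(xs.len(), 3);
}
```

The names here are made up for illustration; the point is that a caller physically cannot hand this API an empty list, so the "what if it's empty?" branch never needs to exist downstream.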

  • Your charitable reading is too charitable. One of the benefits of using types to help guarantee properties of programs (e.g. invariants) is that types do not get out of sync with the code, because they are part of the code, unlike documentation. The language implementation (e.g. the compiler) automatically checks that the types continue to match the rest of the code, in order to catch problems as early as possible.

    • I'm not a kernel developer, and I've never done anything of the sort either. But I think the argument is that if they have two versions of something (the C version + the Rust bindings), the logic/behavior/"semantics" of the C version would need to be encoded into the Rust types, and if a C-only developer changes the C version only, how are they supposed to proceed with updating the Rust bindings if they don't want to write Rust?

      At least that's my understanding from the outside, someone please do correct me if wrong.

      17 replies →

    • Yes, but generic code complicates the picture. The things I saw were like: The documentation says you need a number but actually all you need is for the + operator to be defined. So if your interface only accepts numbers it is unnecessarily restrictive.

      Conversely some codepath might use * but that is not in the interface, so your generic code works for numbers but fails for other types that should work.
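That over/under-restrictive documentation problem is exactly what trait bounds address: the signature itself states the real requirement. A minimal sketch (function name is made up):

```rust
use std::ops::Add;

// The bound says precisely what's needed: `+` must exist, nothing more.
// Prose docs saying "takes a number" would be over-restrictive by comparison.
fn sum2<T: Add<Output = T>>(a: T, b: T) -> T {
    a + b
}

fn main() {
    assert_eq!(sum2(2, 3), 5);       // integers work
    assert_eq!(sum2(1.5, 2.5), 4.0); // so does anything else implementing Add
}
```

And the converse case is handled too: if the body later starts using `*`, the compiler refuses to build until a `Mul` bound is added, so the interface and the implementation can't silently drift apart the way docs and code can.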

      2 replies →

  • It's practically impossible to document all your assumptions in the type system. Attempting to do so results in code that is harder to read and write.

    You have a choice between code that statically asserts all assumptions in the type system but doesn't exist, is slow, or a pain to work with, and code that is beautiful, obvious, performant, but does contain the occasional bug.

    I am not against static safety, but there are trade offs. And types are often not the best way to achieve static safety.

    • > And types are often not the best way to achieve static safety.

      That’s a sort of weird statement to make without reference to any particular programming language. Types are an amazing way to achieve static safety.

      The question of how much safety you can reasonably achieve using types varies wildly between languages. C’s types are pretty useless for lots of reasons - like the fact that all C pointers are nullable. But moving from C to C++ to Rust to Haskell to ADA gives you ever more compile time expressivity. That type expressivity directly translates into reduced bug density. I’ve been writing rust for years, and I’m still blown away by how often my code works correctly the first time I run it. Yesterday the typescript compiler (technically esbuild) caught an infinite loop in my code at compile time. Wow!

      I’d agree that every language has a sweet spot. Most languages let you do backflips in your code to get a little more control at compile time at the expense of readability. For example, C has an endless list of obscure __compiler_directives that do all sorts of things. Rust has types like NonZeroUsize - which seem like a good idea until you try it out. It’s a good idea, but the ergonomics are horrible.

      But types can - and will - take you incredibly far. Structs are a large part of what separates C from assembler. And types are what separates rust from C. Like sum types. Just amazing.
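For instance, the NonZeroUsize mentioned above (ergonomics quibbles aside) moves a "must not be zero" precondition out of the docs and into the signature. A toy sketch (function name is made up):

```rust
use std::num::NonZeroUsize;

// Division by zero is unrepresentable here: the caller proves non-zero-ness
// once at the boundary, instead of every callee defensively re-checking.
fn chunks_needed(len: usize, chunk: NonZeroUsize) -> usize {
    let c = chunk.get();
    (len + c - 1) / c // ceiling division, safe because c >= 1
}

fn main() {
    let chunk = NonZeroUsize::new(4).expect("4 is non-zero");
    assert_eq!(chunks_needed(10, chunk), 3);
    // The bad value can't even be constructed:
    assert!(NonZeroUsize::new(0).is_none());
}
```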

    • Encoding assumptions and invariants in the type system is a spectrum. Rust, by its very nature, places you quite far along that spectrum immediately. One should consider if the correctness achieved by this is worth the extra work. However, if there is one place where correctness is paramount, surely it's the Linux kernel.

      > [..]Attempting to do so results in code that is harder to read and write.

      > You have a choice between code that statically asserts all assumptions in the type system but doesn't exist, is slow, or a pain to work with, and code that is beautiful, obvious, performant, but does contain the occasional bug.

      I don't think you are expressing objective truth, this is all rather subjective. I find code that encodes many assumptions in the type system beautiful and obvious. In part this is due to familiarity, of course something like this will seem inscrutable to someone who doesn't know Rust, in the same way that C looks inscrutable to someone who doesn't know any programming.

      1 reply →

    • But if the info is something the user of your code needs in order to interface correctly, the point that you can't document everything is moot: you already have to put it in the documentation anyway.

> It makes sense to be extremely adversarial about accepting code because they're on the hook for maintaining it after that. They have maximum leverage at review time, and 0 leverage after.

I don't follow. The one with zero leverage is the contributor, no? They have to beg and plead with the maintainers to get anything done. Whereas the maintainers can yank code out at any time, at least before when the code makes it into an official stable release. (Which they can control - if they're not sure, they can disable the code to delay the release as long as they want.)

  • Maintainers can't yank out code if that leads to feature, performance or user space regressions.

That's useful context because as a complete layman I thought his message was largely reasonable (though I am not unsympathetic to the frustration of being on the other side)!

  • Here is what I learned the hard way: the request sounds reasonable. And that doesn’t matter (sucks, I know.)

    Here is the only thing that matters in the end (I learned this an even harder way: I really wanted to work the way the R4L people approach this, and was bitten by counter-examples left, right, and center): The Linux kernel has to work. This is even more important than knowing why it works. There is gray area, and you only move forward by rejecting anything that doesn't have at least ten years of this kind of backwards-compatible commitment. All of it. Wholesale. (And yes, this blatantly and callously disregards many good efforts, dismissing them with what sounds like a tenuous and entitled claim: "not good enough".)

    But it’s the only thing that has a good chance of working.

    Saying that gravity is a thing is not the same attitude as liking that everyone is subject to gravity. But hoping that gravity just goes away this once is wishful thinking of the least productive kind.

    Rust is not "sufficiently committed" to backwards compatibility. Firstly, it is too young to know for sure, and the burden is solely on "the Rust community" here. (Yes, that sucks. Been there.)

    Secondly, there have been changes (other posters mentioned "Drop"), and the way cargo is treated counter-indicates this.

    Rust can prove all the haters wrong. They will then be even more revered than Linux and Debian. But they have to prove this. That is a time consuming slog. With destructive friction all the way.

    This is the way.

> Marcan links to an email by Ted Tso'o (https://lore.kernel.org/lkml/20250208204416.GL1130956@mit.ed...) that is interesting to read. Although it starts on a polarising note ("thin blue line")

Can I say that I was immediately put off by the author conflating the "thin blue line" quote with a political orientation?

The full quote (from the article) being: "Later in that thread, another major maintainer unironically stated “We are the ‘thin blue line’”, and nobody cared, which just further confirmed to me that I don’t want to have anything to do with them."

The way I read it, "thin blue line" is being used as a figure of speech. I get what they are referring to and I don't see an endorsement. It doesn't necessarily mean a right-wing affiliation or sympathy.

To me it seems like the author is projecting a right-wing affiliation and a political connotation where there is none (at least not officially, as far as I can see on https://thunk.org/tytso/) in order to discredit Theodore Ts'o. Which is a low point, because attacking Ts'o on a personal level means Martin is out of ammunition to back his arguments.

But then again, Hector Martin is the same person who thought that brigading and shaming on social media are an acceptable approach to collaboration in the open source space:

    "If shaming on social media does not work, then tell me what does, because I'm out of ideas."

from https://lkml.org/lkml/2025/2/6/404

To me, from the outside, Hector Martin looks like a technically talented but otherwise toxic person who is trying to use public shaming on social media and ranting on his blog as tactics to impose his will on the otherwise democratic process of developing the Linux kernel. And then, on top of everything, he behaves like a victim.

It's a good thing they are resigning, in my opinion.

  • Thank you for pointing this out—willfully, uncharitably misinterpreting “thin blue line” as used by Ts'o demonstrates a severe lack of empathy for people in his position.

    Jumping to conclusions about police brutality and so forth (as many here in the comments are doing) is very frustrating to see, because, in context, the intent of his phrasing is very clear to anyone who doesn't needlessly infer Contemporary Political Nonsense in literally everything they read.

Perhaps contributors should have to go through a process of learning the codebase first, submitting increasingly complex fixes before jumping to really complex merge requests.

It can be hard when solving your own acute issue - doing so doesn't mean it is the only fix or the one the project should accept.

Even if it's beneath someone's talent to have to do it, it is an exercise of community building.

> This is par for the course I guess, and what exhausts folks like marcan. I wouldn't want to work with someone like Ted Tso'o, who clearly has a penchant for flame wars and isn't interested in being truthful.

I am acquainted with Ted via the open source community, we have each other on multiple social media networks, and I think he's a really great person. That said, I also recognize when he gets into flame wars with other people in the open source social circles, and sometimes those other people are also friends or acquaintances.

I can think of many times Ted was overly hyperbolic, but he was ultimately correct. Here is the part of the Linux project I don't like sometimes, which was recently described well in this recent thread. Being correct, or at least being subjectively correct by having extremely persuasive arguments, yet being toxic... is still toxic and unacceptable. There are a bazillion geniuses out there, and being smart is not good enough anymore in the open source world; one has to overcome those toxic "on the spectrum" tendencies or whatever, and be polite while making reasonable points. This policy extends to conduct as well as words written in email/chat threads. Ted is one of those, alongside Linus himself, who have in the past indulged in a bit of shady conduct or remarks, but their arguments are usually compelling.

I personally think of these threads in a way related to the calculus of infinitesimals, using the "standard part" function to zero away hyperbolic remarks the same way that function zeros away infinitesimals from hyperreal numbers, sorta leaving the real remarks. This is a problem, because it's people like me, arguably the reasonable people, who through our silence enable these kinds of behaviours.

I personally think Ted is more right than wrong, most of the time. We do disagree sometimes though; for example, Ted hates the new MiB/KiB system of base-2 units, and for whatever reason likes the previous, more ambiguous system of confusingly mixed base-10/base-2 units of MB/Mb/mb/KB/Kb/kb... and I totally get his argument that a new standard makes something already confusing even more confusing, or something like that. Meh...

  • > Ted hates the new MiB/KiB system of base-2 units, and for whatever reasons like the previous more ambiguous system of confusingly mixed base-10/base-2 units of MB/Mb/mb/KB/Kb/kb

    Here's my best argument for the binary prefixes: Say you have a cryptographic cipher algorithm that processes 1 byte per clock cycle. Your CPU is 4 GHz. At what rate can your algorithm process data? It's 4 GB/s, not 4 GiB/s.

    This stuff happens in telecom all the time. You have DSL and coaxial network connections quantified in bits per second per hertz. If you have megahertz of bandwidth at your disposal, then you have megabits per second of data transfer - not mebibits per second.

    Another one: You buy a 16 GB (real GB) flash drive. You have 16 GiB of RAM. Oops, you can't dump your RAM to flash to hibernate, because 16 GiB > 16 GB so it won't fit.

    Clarity is important. The lack of clarity is how hundreds of years ago, every town had their own definition of a pound and a yard, and trade was filled with deception. Or even look at today with the multiple definitions of a ton, and also a US gallon versus a UK gallon. I stand by the fact that overloading kilo- to mean 1024 is the original sin.
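    The hibernation example above is just arithmetic; a quick sketch of both calculations:

```rust
fn main() {
    const GB: u64 = 1_000_000_000; // SI gigabyte (base 10)
    const GIB: u64 = 1 << 30;      // gibibyte (base 2)

    let ram = 16 * GIB;   // 17_179_869_184 bytes
    let flash = 16 * GB;  // 16_000_000_000 bytes

    // Roughly 1.1 GB of RAM contents have nowhere to go.
    assert_eq!(ram - flash, 1_179_869_184);

    // And the cipher example: 1 byte/cycle at 4 GHz is an SI rate,
    // i.e. 4 GB/s, which is strictly less than 4 GiB/s.
    let throughput = 4 * GB; // bytes per second
    assert!(throughput < 4 * GIB);
}
```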

    • > Another one: You buy a 16 GB (real GB) flash drive. You have 16 GiB of RAM. Oops, you can't dump your RAM to flash to hibernate, because 16 GiB > 16 GB so it won't fit.

      Right, but the problem here is that RAM is produced in different units than storage. It seems strictly worse if your 16 GB of RAM doesn't fit in your 16 GB of storage because you didn't study the historical marketing practices of these two industries, than if your 16 GiB of RAM doesn't fit in your 16 GB of storage, because at least in the second case you have something to tip you off to the fact that they're not using the same units.

  • > I can think of many times Ted was overly hyperbolic, but he was ultimately correct. Here is the part of the Linux project I don't like sometimes, which was recently described well in this recent thread. Being correct, or at least being subjectively correct by having extremely persuasive arguments, yet being toxic... is still toxic and unacceptable.
    I want to say that I am thankful in this world that I am a truly anonymous nobody who writes code for closed-source mega corp CRUD apps. Being a tech "public figure" (Bryan Cantrill calls it "nerd famous") sounds absolutely awful. Every little thing that you wrote on the Internet in the last 30 years is permanently recorded (!!!), then picked apart by every Tom, Dick, Harry, and Internet rando. My ego could never survive such a beating. And, yet, here we are in 2025, where Ted Ts'o continues to maintain a small mountain of file system code that makes the Linux world go "brrr".

    Hot take: Do you really think you could have done better over a 30 year period? I can only answer for myself: Absolutely fucking not.

    I, for one, am deeply thankful for all of Ted's hard work on Linux file systems.

    • There are plenty of "nerd famous" people who manage it by just not being an asshole. If you're already an asshole being "nerd famous" is going to be rough, yes, but maybe just don't be one?

      1 reply →