Comment by rectang
1 year ago
I care less about the quality of the dependencies than about the burden of protecting against supply chain attacks when there are a lot of dependencies.
In the past, I worked on a project for Luxembourg's CTIE (their IT administration). In most cases, they explicitly requested that we reimplement features we needed instead of including more third-party libraries. They allowed only libraries essential to the project, like Struts for the web framework, or implementations of standard APIs like JPA, JTA, etc. that came with WebSphere. Everything else we had to reimplement. For them, it was simply much easier to manage, given the number of systems they are responsible for. And the permitted libraries were allowed only in versions they had already reviewed for security issues. In the end, reimplementing features/functions that we could have pulled in from other libraries was never the cause of any problem: the practice requires some additional work, but it was never significant enough to affect our ability to deliver the project as expected.
How many people do the security code review under this process? How do they avoid piling up dozens of well-hidden holes when you don't use a library that is publicly available and seen by thousands of eyes?
Isn't the best argument for open-source code that it has so many eyes on it that most companies cannot afford comparable quality assurance on their own?
Indeed, and that's a good reason to avoid third-party dependencies. But it's irrelevant to the choice of programming language: a language with a bad dependency manager might force you to build everything yourself, but you can always do that anyway. Even in a language with a good dependency manager, you can simply choose to build everything yourself if you care.
Perplexingly, the original commenter seems to understand that this doesn't matter, and then handwaves away the correct conclusion.
It remains relevant to programming language choice because the "best in class" libraries in Rust often have lots of dependencies, thanks to Rust culture and cargo's design.
I'd like to be able to pick a few libraries without incurring a huge ongoing audit burden. If I have to exclude many popular libraries because they have oodles of dependencies, that both makes searching more laborious and limits my choices.
I still don't understand what alternative people are arguing in favor of. When I think of those "best in class" libraries like regex, serde, etc, those are multiple crates that are developed by the same teams. Having one massive crate or one hundred tiny crates is irrelevant here, because if they're all developed by the same contributors it does not increase your trusted computing base.
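To make the trusted-base point concrete, here is a toy sketch (the crate-to-team mapping below is hypothetical, chosen purely for illustration): six crates that collapse to two publishing teams, so the number of parties you have to trust is two, not six.

```shell
# Hypothetical list of "crate publishing-team" pairs; the crate count
# is six, but the set of distinct teams you must trust is smaller.
teams=$(printf '%s\n' \
  'serde serde-team' \
  'serde_derive serde-team' \
  'serde_json serde-team' \
  'regex rust-lang' \
  'regex-syntax rust-lang' \
  'regex-automata rust-lang' \
  | awk '{print $2}' | sort -u)

echo "$teams"                 # the distinct trust roots
echo "$teams" | wc -l         # how many parties you actually trust
```

Counting crates overstates the audit burden whenever several of them share maintainers; counting distinct publishers is the more honest metric.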
The best-in-class libraries depend on many crates. But crates are often organized into workspaces to speed up compilation or to split out independent parts.
So how many dependencies are there truly when you peel away the first layer of the onion?
https://doc.rust-lang.org/cargo/reference/workspaces.html
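One way to peel the onion is `cargo tree --prefix none`, which prints one crate per line, duplicates included, so piping through `sort -u | wc -l` gives the true count. A minimal sketch using a canned sample (the version numbers are illustrative; no cargo is invoked here):

```shell
# Hypothetical output of `cargo tree --prefix none` for a project
# depending on regex; note the duplicate memchr line, which the
# real command also produces when a crate appears on several paths.
sample='regex v1.10.2
regex-automata v0.4.3
regex-syntax v0.8.2
aho-corasick v1.1.2
memchr v2.6.4
memchr v2.6.4'

# De-duplicate, then count the distinct crates.
count=$(printf '%s\n' "$sample" | sort -u | wc -l | tr -d ' ')
echo "$count unique crates"
```

On a real project you would run `cargo tree --prefix none | sort -u | wc -l` in the workspace root to get the same figure.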
What is the shape of these dependency trees? Is it really hundreds of single-type + single-function crates? Could there ever be a path to scrub out the smaller dependencies and integrate them into larger crates with more concrete functionality?
What's the status of potential distributed code review systems like cargo-crev?
> Is it really hundreds of single-type + single-function crates?
No, and I think this is the crucial thing that people with NPM experience overlook when it comes to Rust. Rust emphatically does not have a culture of single-function microlibraries; instead, libraries are split out by purpose, in the same way you would modularize a C codebase.
Remember, Rust crates are not just units of distribution, they are also units of translation (a.k.a. compilation units), so the same pressures that cause people to split C projects into multiple files results in people splitting Rust projects into multiple crates.
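A sketch of what that splitting looks like in practice, with hypothetical member names: a single workspace whose crates play the role that separate translation units play in a C project.

```shell
# Hypothetical workspace layout: one repo, one team, several crates
# that compile as separate units (analogous to separate .c/.o files).
root=$(mktemp -d)
mkdir -p "$root/core" "$root/parser" "$root/cli"

cat > "$root/Cargo.toml" <<'EOF'
[workspace]
members = ["core", "parser", "cli"]
EOF

cat "$root/Cargo.toml"
```

From the outside this project may look like three dependencies, but it is one codebase with one set of maintainers.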
Well, if you look at the most recent open-source supply chain attack on OpenSSH, it used social engineering to add a backdoor through a project that OpenSSH did not have a dependency on anywhere in its SBOM. And with the xz example, the backdoor had to be rushed out when it was deployed, because the dynamic dependency was being removed before the backdoor was completely in place. Pulling off an open-source supply chain attack is not easy, fast, or reliable for long.
It is not as simple as you say. Sometimes it is better to know that all of your dependencies are statically linked at build time and specified when you release your code. And the saner your build system is, the harder it is to add shellcode to a dependency's tarball and build scripts without turning people's heads with random unsafe code.
> And with the xz example, the backdoor had to be rushed out when it was deployed, because the dynamic dependency was being removed before the backdoor was completely in place. Pulling off an open-source supply chain attack is not easy, fast, or reliable for long.
If the xz backdoor had not been found by dumb luck, it could have persisted for a long time. Backdoors have persisted for years before, maybe even decades. xz is also a package with a lot of eyes on it compared to obscure packages. So I don't think you're right even a little bit, especially for huge projects or projects with lots of dependencies.
I don't think any of your points detract from the original argument. Having more dependencies just widens the attack surface area, and makes an attack like this easier, depending on the motivation and resources of an MCA.
> Doing a open source supply chain attack is not easy
When projects ship 600 dependencies it's really easy.
> not easy, fast or reliable for long.
It does not need to be long. One day is enough to compromise a system or a thousand.
Thank you.
As someone who works in cybersecurity and closely with our developers, I see that a lot of them tend to inherently trust third-party code with no auditing of the supply chain. I am always making the case that, yes, we don't need to reinvent the wheel, and libraries/packages are important, but our organization and developers need to be aware of what we are bringing into our network and our codebase.
As someone who also works in cybersecurity, we use Rust extensively and are sunsetting all of our C code. We use third-party dependencies judiciously and never deploy anything without auditing it. It's great that Rust facilitates this convenient ecosystem, and it's to Rust's benefit, not Rust's detriment, that the ecosystem exists.
Many orgs want to save costs while using open-source software. In a world after the xz incident, they are in a very difficult place. It all comes down to whom one trusts. As someone who works in a cost-aware business like gamedev, I think Rust has a unique chance to capture the trust market. If major Rust sponsor organizations donated information about even a fraction of what they have audited, that could create a safe haven for smaller orgs and startups. Even large orgs that have audit histories going back decades and long lists of trusted C/C++/C#/Java projects might buy in, because it is not reasonable to expect them to keep up with every open project's updates.
> We use third-party dependencies judiciously and never deploy anything without auditing it.
This is how I think it should be, of course. Like I said, I'm not against the use of third-party code or dependencies; I'm against using them without performing any audit of that code.
Nothing stops you from vendoring them into your repo and hand-updating each one. But how would you do this in C++? Write everything from scratch? I mean, Rust doesn't stop you there.
[edit] typos
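For cargo specifically, the vendoring workflow is built in: `cargo vendor` copies every dependency's source into `./vendor` and prints a config snippet that pins crates.io to that directory, so all future builds use only the vendored, reviewable copies. A sketch that writes the snippet by hand (no cargo invoked; the temp-dir path is illustrative):

```shell
# Sketch of the vendoring setup. In a real project you would run
# `cargo vendor` and paste its printed snippet into .cargo/config.toml;
# here we just write that snippet directly to show what it pins.
dir=$(mktemp -d)
mkdir -p "$dir/.cargo"

cat > "$dir/.cargo/config.toml" <<'EOF'
[source.crates-io]
replace-with = "vendored-sources"

[source.vendored-sources]
directory = "vendor"
EOF

cat "$dir/.cargo/config.toml"
```

With this in place, dependency updates become explicit diffs in your own repo rather than implicit network fetches, which is exactly the "hand update each" workflow described above.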
Well, no one is forcing you to use these dependencies. Rust crates tend to be very minimal because of how easy it is to use them.
The amount of code you have to review stays the same.
So ... then don't use them? No one forces anyone to use any dependencies in Rust, lol. It's just faster to use stuff that's already made.