Comment by simiones

2 months ago

The point is still different. In PyPI, if I put `requests` in my requirements.txt, and I run `pip install -r requirements.txt` every time I do `make build`, I will still only get one version of requests - the latest available the first time I installed it. This severely reduces the attack radius compared to NPM's default, where I would get the latest (patch) version of my dependency every day. And the ecosystem being committed to respecting semver is entirely irrelevant to supply chain security. Malicious actors don't care about semver.
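
To make the pip behavior concrete (a minimal sketch; the resolved version number is illustrative):

```
# Unpinned spec: resolves to whatever is newest at first install
echo "requests" > requirements.txt
pip install -r requirements.txt    # installs, say, requests 2.31.0 today

# Re-running against the same environment is a no-op:
pip install -r requirements.txt    # "Requirement already satisfied"

# ...but any *fresh* environment resolves from scratch and gets the latest:
python -m venv fresh
fresh/bin/pip install -r requirements.txt
```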

Overall, publishing a new malicious version of a package is a much lesser problem in virtually any ecosystem other than NPM; in NPM, it's almost an automatic remote code execution vulnerability for every NPM dev, and a persistent threat for many NPM packages even without this.

> This severely reduces the attack radius compared to NPM's default, where I would get the latest (patch) version of my dependency every day.

By default, npm creates a lock file and gives you the exact same version every time, unless you explicitly initiate an upgrade. Additionally, you could even remove package-lock.json and do a new `npm install`, and it still wouldn't upgrade the package if it already exists in your node_modules directory.

The only time this would be true is if you manually bumped the version to something incompatible, or removed both package-lock.json and your node_modules folder.
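
Concretely (a sketch; the package and versions are made up for illustration):

```
# package.json declares a semver range:
#   "dependencies": { "lodash": "^4.17.0" }
# package-lock.json records the exact resolution:
#   "lodash": { "version": "4.17.21", ... }

npm install         # honors package-lock.json -> 4.17.21 every time
npm update lodash   # the explicit opt-in that re-resolves within ^4.17.0
```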

  • Ahh, this might explain the behavior I observed when running `npm install` from a freshly checked-out project, where it basically ignored the lock file. If I recall correctly, the solution in that situation was to run `npm clean-install` (aka `npm ci`), and then it would use the lock file, as sketched below.
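
For reference, `npm ci` is the strict path for fresh checkouts; a sketch with a hypothetical repository URL:

```
git clone https://example.com/project.git && cd project

# Deletes any existing node_modules, installs exactly what
# package-lock.json records, and errors out (rather than silently
# re-resolving) if the lock file and package.json disagree:
npm ci
```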

Generally you have the right of it, but a word of caution for Pythonistas:

> The point is still different. In PyPI, if I put `requests` in my requirements.txt, and I run `pip install -r requirements.txt` every time I do `make build`, I will still only get one version of requests - the latest available the first time I installed it.

Only because your `make build` is a custom process that doesn't use build isolation and relies on manually invoking pip in an existing environment.

Ecosystem-standard build tools (including pip itself, via `pip wheel`, which really isn't meant for distribution, but some people seem to use it anyway) default to setting up a new virtual environment to build your code, and another for each transitive dependency that requires building, to make sure that your dependencies' build tools aren't mutually incompatible or broken by other things in the environment. They will read `requests` from `[project.dependencies]` in your pyproject.toml and dump the latest version into that new environment, unless you use tool-specific configuration (or, of course, a tighter specification in pyproject.toml) to prevent that. And if your dependencies were only available as sdists, the build tool would even automatically and recursively attempt to build those, potentially running arbitrary code from each package in the process.
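
A sketch of what that looks like with pip's own builder (versions illustrative):

```
# pyproject.toml
#   [project]
#   dependencies = ["requests"]           # unconstrained
#   dependencies = ["requests==2.31.0"]   # pinned

# Builds wheels for the project and everything it depends on; an
# unconstrained "requests" resolves to the newest release at that moment:
pip wheel .

# Builds in the current environment instead of a fresh isolated one
# (build requirements must already be installed):
pip wheel . --no-build-isolation
```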

> every time I do `make build`

I'm going to assume you're running this locally to generate releases, presumably for personal projects?

If you're building your projects in CI, you're not pulling in the same version each time without a lockfile in place.
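
For pip users, the equivalent of committing a lockfile can be sketched with pip-tools (uv and Poetry offer the same idea; the flags below assume a recent pip-tools):

```
# Locally: resolve once, with hashes, and commit the output
pip install pip-tools
pip-compile --generate-hashes -o requirements.lock pyproject.toml

# In CI: install exactly what was locked; pip rejects anything whose
# hash doesn't match, so a newly published version can't sneak in
pip install --require-hashes -r requirements.lock
```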