Comment by westurner

4 months ago

I decided to look into how this works these days.

These days you need a TOML parser to read pyproject.toml; `tomllib` has been in the Python standard library since 3.11 (read-only; there's still no stdlib TOML writer), and older interpreters need a third-party package like `tomli`: https://packaging.python.org/en/latest/guides/writing-pyproj...
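A minimal sketch of reading pyproject.toml with the stdlib (the `tomli` fallback covers pre-3.11 interpreters; the inline TOML is illustrative):

```python
import sys

# tomllib landed in the stdlib in Python 3.11; tomli is the
# API-compatible backport for older interpreters.
if sys.version_info >= (3, 11):
    import tomllib
else:
    import tomli as tomllib

# Parse an inline example; for a real file use tomllib.load(open(path, "rb")).
pyproject = tomllib.loads("""
[project]
name = "example"
dependencies = ["requests>=2.28"]
""")
print(pyproject["project"]["dependencies"])  # ['requests>=2.28']
```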

pip's docs strongly prefer pyproject.toml: https://pip.pypa.io/en/stable/reference/build-system/pyproje...

Over setup.py's setup(setup_requires=[], install_requires=[]): https://pip.pypa.io/en/stable/reference/build-system/setup-p...
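The declarative pyproject.toml replacement for those two arguments looks roughly like this (package name and pins are illustrative):

```toml
[build-system]                      # replaces setup_requires
requires = ["setuptools>=64", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "example"
version = "0.1.0"
dependencies = ["requests>=2.28"]   # replaces install_requires
```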

Blaze and Bazel have Skylark/Starlark to support procedural build configuration with maintainable conditionals

Bazel docs > Starlark > Differences with Python: https://bazel.build/rules/language

cibuildwheel: https://github.com/pypa/cibuildwheel ;

> Builds manylinux, musllinux, macOS 10.9+ (10.13+ for Python 3.12+), and Windows wheels for CPython and PyPy;

manylinux used to specify a minimum glibc version for each build tag like manylinux2010 or manylinux2014; pypa/manylinux: https://github.com/pypa/manylinux#manylinux

A manylinux_x_y wheel requires glibc>=x.y. A musllinux_x_y wheel requires musl libc>=x.y; per PEP 600: https://github.com/mayeut/pep600_compliance#distro-compatibi...
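As a sketch of that naming scheme (the `min_glibc` helper is hypothetical; the legacy aliases follow PEP 600):

```python
import re

# Map a wheel platform tag to the minimum glibc it requires.
# Legacy aliases per PEP 600: manylinux1 == glibc 2.5,
# manylinux2010 == glibc 2.12, manylinux2014 == glibc 2.17.
LEGACY = {"manylinux1": (2, 5), "manylinux2010": (2, 12), "manylinux2014": (2, 17)}

def min_glibc(platform_tag):
    m = re.match(r"manylinux_(\d+)_(\d+)_", platform_tag)
    if m:  # PEP 600 style: manylinux_x_y_<arch>
        return int(m.group(1)), int(m.group(2))
    for name, version in LEGACY.items():
        if platform_tag.startswith(name + "_"):
            return version
    return None  # not a manylinux tag (e.g. win_amd64, musllinux_1_2_x86_64)

print(min_glibc("manylinux_2_28_x86_64"))  # (2, 28)
print(min_glibc("manylinux2014_aarch64"))  # (2, 17)
```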

> Works on GitHub Actions, Azure Pipelines, Travis CI, AppVeyor, CircleCI, GitLab CI, and Cirrus CI;

Further software supply chain security controls: SLSA.dev provenance, Sigstore, and the new PyPI attestations storage

> Bundles shared library dependencies on Linux and macOS through `auditwheel` and `delocate`

delvewheel (Windows) is similar to auditwheel (Linux) and delocate (Mac) in that it copies DLL files into the wheel: https://github.com/adang1345/delvewheel

> Runs your library's tests against the wheel-installed version of your library

Conda runs tests of installed packages;

Conda docs > Defining metadata (meta.yaml) https://docs.conda.io/projects/conda-build/en/latest/resourc... :

> If this section exists or if there is a `run_test.[py,pl,sh,bat,r]` file in the recipe, the package is installed into a test environment after the build is finished and the tests are run there.
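E.g. a meta.yaml test section (package name is illustrative):

```yaml
test:
  imports:
    - example          # "import example" must succeed in the test env
  requires:
    - pytest
  commands:
    - pytest --pyargs example
```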

Things that support conda meta.yaml declarative package metadata: conda and anaconda, mamba and mambaforge, picomamba and emscripten-forge, pixi / uv, repo2docker REES, and probably repo2jupyterlite (because jupyterlite's jupyterlite-xeus docs mention mamba but not yet picomamba) https://jupyterlite.readthedocs.io/en/latest/howto/configure...

The `setup.py test` command has been removed: https://github.com/pypa/setuptools/issues/1684

`pip install -e .[tests]` expects extras_require['tests'] to include the same packages as the tests_require argument to setup.py: https://github.com/pypa/setuptools/issues/267
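The pyproject.toml equivalent of that extra (names are illustrative):

```toml
[project.optional-dependencies]
tests = ["pytest", "pytest-cov"]
```

Then `pip install -e .[tests]` pulls those in alongside the regular dependencies.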

TODO: is there a new single command to run tests, like `setup.py test` was?

`make test` works with my editor. A devcontainer.json can reference a Dockerfile that runs something like this:

  python -m ensurepip && python -m pip install -U pip setuptools

But I still want to run the software's tests with one command.
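A minimal Makefile target for that (a sketch; pytest as the test runner is an assumption):

```make
.PHONY: test
test:
	python -m pip install -e .[tests]
	python -m pytest
```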

Are you telling me there's a way to do an HTTPS Range request for the metadata file in a wheel, to get the package dependency version constraints and/or package hashes (but not GPG pubkey fingerprints to match an .asc manifest signature) and the build & test commands? And yet you still need an additional file beyond the TOML-syntax pyproject.toml, like Pipfile.lock or poetry.lock, to store the hashes for each ~bdist wheel on each platform. There's now a -c / PIP_CONSTRAINT option to specify an additional requirements.txt, but does that solve for Windows-only or Mac-only requirements in a declarative requirements.txt? https://pip.pypa.io/en/stable/user_guide/#constraints-files
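FWIW, PEP 508 environment markers can make individual requirements.txt lines platform-conditional (package names here are illustrative):

```
pywin32 >= 306 ; sys_platform == "win32"
pyobjc ; sys_platform == "darwin"
```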

conda supports putting `[win]` at the end of a YAML list item (in a `# [win]` selector comment) if it's for Windows only.
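E.g. a conda-build selector in meta.yaml:

```yaml
requirements:
  run:
    - python
    - pywin32  # [win]
```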

Re: optimizing builds for conda-forge (and PyPI, though PyPI doesn't build packages when there's a new PR, or sign each build for each platform): https://news.ycombinator.com/item?id=41306658