
Comment by __MatrixMan__

2 years ago

I certainly don't have that social capital right now, but I do intend to be hacking around in these areas in the future. If I get something useful working, then I'll probably be in a position to build that capital.

Now that we've reframed it as something that puts users in explicit control of their trust graph, rather than something that would make NixOS like all the other distros, it's a feature I'd like to use.

----

I'm working on a data annotation layer which I intend to apply as a filesystem, using nix as my test case. Edits are conceived of as annotations, so there's a sense in which every file can be built by following a path of edits from the empty file. This makes for slow reads but near-instant copies (you're just adding a new pointer at the same position in the edit graph as the file you're copying). It's a strange design choice, but it would solve some problems when using nix flakes with large repositories (everything gets copied into the nix store, and that can take a while).
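
A rough sketch of the shape I mean, in Python (the names and the insert-only edit model are mine for illustration, not the actual implementation):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Edit:
    """One annotation: splice `data` into the parent's content at `offset`."""
    parent: Optional["Edit"]  # None is the empty file
    offset: int
    data: bytes

def read(edit: Optional[Edit]) -> bytes:
    """The slow read: replay the whole chain of edits from the empty file."""
    if edit is None:
        return b""
    base = read(edit.parent)
    return base[:edit.offset] + edit.data + base[edit.offset:]

# A "copy" is just a second name bound to the same node in the edit
# graph -- no bytes move, which is why copies are near-instant.
hello = Edit(None, 0, b"hello")
hello_world = Edit(hello, 5, b" world")
copy_of_hello_world = hello_world  # new pointer, same position in the graph

assert read(hello_world) == b"hello world"
assert read(copy_of_hello_world) == b"hello world"
```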

Signatures are exactly the sort of thing I was imagining living in an annotation. Ideally, the annotation adheres to patterns in the data rather than to the named file, so if the package applied a patch which invalidates the signature, the signature doubles as a link to the original. That original can then be diffed with the patched version, and the user can be shown why the check fails, not just that it fails. It's a long shot, but that's the dream.
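
To sketch just the failure path, assuming a hypothetical hook where verification has already failed and the annotation has been followed back to the content the signature was originally made over:

```python
import difflib

def explain_signature_failure(as_signed: bytes, as_patched: bytes) -> str:
    """Show *why* verification fails: diff the content the signature was
    made over (recovered by following the annotation back through the
    edit graph) against the patched content actually being read."""
    return "\n".join(difflib.unified_diff(
        as_signed.decode(errors="replace").splitlines(),
        as_patched.decode(errors="replace").splitlines(),
        fromfile="as-signed",
        tofile="as-patched",
        lineterm="",
    ))
```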

With a bit of luck I'll find a more elegant way to handle this without playing whack-a-mole with each user-facing utility that builds a derivation. Maybe some kind of dashboard which shows you which files the system is rendering and whether they have associated signatures (or other metadata: known-malicious, etc.). The challenge, in the signature case, will be knowing which files are OK to be unsigned and which need to fail on read if not signed. Certainly we can't require a separate signature for every file we render.
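
One naive way to make that policy concrete (the patterns and the fail-on-read rule are made up, just to show the shape of the decision):

```python
import fnmatch

# Hypothetical policy: most files may render unsigned, but anything
# matching these patterns must carry a valid signature annotation.
REQUIRE_SIGNATURE = ["*.patch", "*.drv", "flake.lock"]

def check_on_read(path: str, has_valid_signature: bool) -> None:
    """Fail the read if this path is one that must be signed and isn't."""
    if has_valid_signature:
        return
    if any(fnmatch.fnmatch(path, pattern) for pattern in REQUIRE_SIGNATURE):
        raise PermissionError(f"refusing to read unsigned file: {path}")
```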

It might be a long time coming, though: this work proceeds on weekends and holidays, and it's pretty far from a useful state. I'm still fiddling with tuning the rolling-hash-based fragmentation algorithm so that files are constructed out of right-sized fragments which end up being reused if the files are similar.
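
For the curious, the fragmentation is the usual content-defined chunking trick: keep a rolling hash of the last few dozen bytes and cut wherever its low bits are all zero, so boundaries depend only on nearby content and similar files end up sharing fragments. A toy version (the window, mask, and hash are exactly the knobs I'm still tuning):

```python
def chunk(data: bytes, window: int = 48, mask: int = 0xFFF) -> list[bytes]:
    """Content-defined chunking: keep a polynomial rolling hash of the
    last `window` bytes and cut wherever its low bits are all zero.
    Boundaries depend only on nearby content, so an insertion early in
    a file doesn't shift every later fragment."""
    BASE, MOD = 257, (1 << 61) - 1
    drop = pow(BASE, window - 1, MOD)  # weight of the byte rolling out
    chunks, start, h = [], 0, 0
    for i, b in enumerate(data):
        if i >= window:
            h = (h - data[i - window] * drop) % MOD  # oldest byte rolls out
        h = (h * BASE + b) % MOD                     # new byte rolls in
        # enforce a minimum fragment size; expected size is ~mask+1 bytes
        if i + 1 - start >= window and (h & mask) == 0:
            chunks.append(data[start:i + 1])
            start = i + 1
    if start < len(data):
        chunks.append(data[start:])
    return chunks
```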