Comment by nyrikki

2 days ago

Rice–Shapiro theorem.

The number of independent writers on a typical OS means you can't rule out the pathological cases.

I suppose you could reduce it to Rice's theorem and how equivalence of Turing machines is undecidable, but remember it generalizes to even total functions.

It just goes back to the fact that deciding equivalence of two static programs requires running them, and there is too much entropy in file operations to practically cover much of the behavior.
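To make the equivalence point concrete, here is the textbook reduction from the halting problem, sketched in Python (the names `make_q` and `zero` are mine, just for illustration):

```python
def make_q(p, x):
    """Build a program whose behavior encodes whether p halts on x."""
    def q(_input):
        p(x)        # runs forever exactly when p does not halt on x
        return 0
    return q

def zero(_input):
    return 0

# make_q(p, x) is equivalent to zero iff p halts on x, so any general
# decider for program equivalence would also decide the halting problem.
```

No amount of static cleverness gets around this; at best you test a finite slice of the behavior.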

When forced to, it can save you, but a block-level copy of a live filesystem is opportunistic.

Crash consistency is obviously the best you can hope for here, so that, plus the holes in classic NFS writes, may be a more accessible lens on the non-happy path than my preferred computability one.
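The failure mode is easy to see in a toy model. Suppose blocks 0 and 3 must carry matching generation tags (a made-up invariant for this sketch), and a writer lands while the copier is partway through:

```python
# Toy disk: the invariant is that blocks 0 and 3 hold the same generation tag.
disk = ["gen1", "data", "data", "gen1"]

copy = []
for i, block in enumerate(disk):
    copy.append(block)
    if i == 1:              # a writer updates both blocks mid-copy
        disk[0] = "gen2"
        disk[3] = "gen2"

# The live disk stays consistent, but the copy mixes generations:
# disk -> ['gen2', 'data', 'data', 'gen2']
# copy -> ['gen1', 'data', 'data', 'gen2']
```

The copy captured block 0 before the write and block 3 after it, a state the filesystem itself never occupied, which is worse than what crash consistency promises.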

The GUID-being-copied-and-no-longer-unique problem I mentioned below is where I have seen people lose data the most.
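A minimal sketch of that failure, with a made-up GUID value and a dict standing in for whatever GUID-keyed lookup the tooling does:

```python
original = {"guid": "e4f1-0a2c", "device": "/dev/sda1"}
clone = dict(original, device="/dev/sdb1")  # a block copy duplicates the GUID

by_guid = {}
for vol in (original, clone):
    by_guid[vol["guid"]] = vol  # the clone silently shadows the original

# Any tool that resolves volumes by GUID now sees only one of the two
# devices, and which one is an accident of enumeration order.
```

Mount-by-UUID, boot loaders, and backup tooling all key on that identifier, so whichever device loses the race effectively disappears.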

The undecidable part is really a caution that it doesn't matter how smart the programmer is: there is simply no general algorithm that can remove the cost of losing the semantic meaning of those writers.

So it is not a good default strategy, only one to use when context forces it.

TL;DR: it's horses for courses.