Comment by Retr0id

2 days ago

I've been thinking about something along these lines, but coupled with deterministic inference. At each "macro" invocation you'd also include hash-of-model and hash-of-generated-text. (Note: determinism doesn't require temperature 0, as long as you can control the RNG seed. But plenty of other things make determinism hard.)
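To make the parenthetical concrete, here's a minimal sketch of seeded sampling: temperature is above 0, yet the run is fully reproducible because the RNG state is pinned. The toy logits function and every name here are hypothetical stand-ins, not any real model's API:

```python
import math
import random

def sample_tokens(logits_fn, seed, temperature=0.8, n=5):
    """Sample n token ids with temperature > 0; a fixed seed
    makes the whole run deterministic."""
    rng = random.Random(seed)  # pinned RNG state: same seed -> same samples
    out = []
    for _ in range(n):
        # softmax-with-temperature over the toy logits
        weights = [math.exp(l / temperature) for l in logits_fn(out)]
        out.append(rng.choices(range(len(weights)), weights=weights)[0])
    return out

# toy stand-in for a model's next-token logits (hypothetical)
toy_logits = lambda ctx: [1.0, 2.0, 0.5, 1.5]

# same seed, same transcript -- even though temperature != 0
assert sample_tokens(toy_logits, seed=42) == sample_tokens(toy_logits, seed=42)
```

Of course, this dodges the hard parts the parent alludes to (floating-point nondeterminism across hardware, batching effects, kernel scheduling); the seed only pins the sampling step.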

You could take it a step further and have a deterministic agent inside a deterministic VM, and you can share a whole project as {model hash, vm image hash, prompt, source tree hash} and have someone else deterministically reproduce it.
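One way to sketch that sharing scheme is to content-address the whole project as a single digest over the four components. The field names, canonical-JSON encoding, and SHA-256 choice are all assumptions for illustration, not a spec:

```python
import hashlib
import json

def project_id(model_hash, vm_image_hash, prompt, source_tree_hash):
    """Content-address a reproducible project: anyone holding the same
    four components can recompute this digest and verify a match."""
    manifest = json.dumps({
        "model": model_hash,
        "vm_image": vm_image_hash,
        "prompt": prompt,
        "source_tree": source_tree_hash,
    }, sort_keys=True)  # canonical key order so the digest is stable
    return hashlib.sha256(manifest.encode()).hexdigest()
```

The point is that the manifest is tiny: you ship four hashes plus a prompt, and the receiver regenerates (and verifies) the full artifact.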

Is this useful? Not sure. One use case I had in mind was as a mechanism for distributing "forbidden software". You can't distribute software that violates the DMCA, for example, but can you distribute a prompt?

Deterministic inference is mechanically indistinguishable from decompression or decryption, so if there's a way to one-weird-trick DMCA, it's probably not this.
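A toy illustration of that equivalence: any deterministic function from a key to bytes looks the same from the outside, whether you call it inference, decompression, or decryption. The hash-counter construction below is a hypothetical stand-in for a model, chosen only because it's obviously just keyed data retrieval:

```python
import hashlib

def keyed_stream(key: bytes, n: int) -> bytes:
    """Deterministic key -> bytes mapping (SHA-256 in counter mode).
    Mechanically indistinguishable from 'decrypting' n bytes with key;
    a deterministic model + prompt has the same shape."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]
```

If distributing `key` plus this function counts as distributing the output, it's hard to see why a prompt plus a pinned model would be treated differently.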

  • You’d think that, but it seems like big business and governments are treating inference as somehow special. I dunno, maybe low temperatures can highlight this weird situation?

    Temperature is an easy knob to twist, after all. Somebody (not me I’m too poor to pay the lawyers) should do a search and find where the crime starts.

    • Well, it's still not deterministic even at temp 0. The tech described in my comment's parent is speculative, and technically it's not even inference, once it's perfectly reproducible.

      At that point it's retrieving results from a database.

      EDIT: how would OP address my main point, which is that det. inference is functionally equivalent to any arbitrary keyed data storage/retrieval system?
