Comment by whatshisface
4 hours ago
Instead of collecting biometric info from humans and IDing all of their online movements, you could mandate that LLM output be watermarked (using a technology that Scott Aaronson was hired by OpenAI to help develop, and which was shut down under Altman right after it was proven to work), so that the machines' output is what gets identified. The implication in this story, that the project was shut down to keep the Orb viable in principle (telling humans they had to be tagged in order to distinguish them from machines that could more easily have been tagged instead), is very easy to pick up.
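For context on how such a watermark can be detected statistically: one published approach (the "green list" scheme, which is in the same family as, but not identical to, Aaronson's proposal) biases the sampler toward a pseudorandom subset of the vocabulary at each step, chosen by hashing the previous token. A detector can then recompute those subsets and check whether green tokens are overrepresented. Below is a minimal sketch; the vocabulary size, green fraction, and function names are all illustrative assumptions, not any vendor's actual implementation.

```python
import hashlib
import math
import random

VOCAB_SIZE = 50_000   # hypothetical vocabulary size
GREEN_FRACTION = 0.5  # fraction of the vocab marked "green" at each step

def green_list(prev_token: int) -> set[int]:
    """Seed an RNG from a hash of the previous token and pick a
    pseudorandom 'green' subset of the vocabulary for the next position."""
    seed = int.from_bytes(
        hashlib.sha256(str(prev_token).encode()).digest()[:8], "big"
    )
    rng = random.Random(seed)
    k = int(VOCAB_SIZE * GREEN_FRACTION)
    return set(rng.sample(range(VOCAB_SIZE), k))

def detect(tokens: list[int]) -> float:
    """Return a z-score: how far the observed count of green tokens
    exceeds what unwatermarked text would produce by chance."""
    hits = sum(
        1 for prev, cur in zip(tokens, tokens[1:]) if cur in green_list(prev)
    )
    n = len(tokens) - 1
    expected = n * GREEN_FRACTION
    stddev = math.sqrt(n * GREEN_FRACTION * (1 - GREEN_FRACTION))
    return (hits - expected) / stddev
```

A watermarking generator would softly prefer green tokens when sampling; text with a z-score of, say, 4 or more is overwhelmingly unlikely to be unwatermarked. This is also why removal is nontrivial: the signal is spread across every token transition, so paraphrasing enough of the text to erase it costs real effort.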
Is that viable given the proliferation of open-weight LLMs that don't apply that sort of watermarking? If somebody with malign intent can skip the attestation, presumably they will, right?
If every major LLM producer applied it, it would not be easy for malicious actors to strip the watermark. That's on top of the fact that most of the problem comes from careless actors, not determined ones.