
Comment by quotemstr

1 day ago

So what? I can probably produce parts of the header from memory. Doesn't mean my brain is GPLed.

If you have seen, say, the Windows source code, you cannot take certain jobs implementing Windows-compatible interfaces that are supposed to be free of Microsoft IP. One could say your brain has been "infected". The same is true of many things in intellectual property.

There is a stupid presupposition that LLMs are equivalent to human brains, which they clearly are not. Stateless token generators are OBVIOUSLY not like human brains, even if you somehow contort the definition of intelligence to include them.

  • Even if they are not "like" human brains in some sense, are they "like" brains enough to be counted similarly in a legal environment? Can you articulate the difference as something other than meat parochialism, which strikes me as arbitrary?

    • If LLMs are enough like human minds, then legally speaking we are abusing thinking and feeling human-like beings possessing will and agency in ways radically worse than slavery.

      What is missing in the “if I can remember and recite a program, then they must be allowed to remember and recite programs” argument is that you choose to do it (and you have basic human rights and freedoms), and they do not.


    • All definitions are arbitrary if you're unwilling to couch them in human experience, because humans are the ones doing the defining. And my main difference is right there in my initial response: an LLM is a stateless function, as the sketch below illustrates. At best, it is a snapshot of a human brain simulated on a computer, but at no point can it learn something new once deployed. That is the MOST CHARITABLE interpretation, and I don't even concede it in reality; it is not even a snapshot of a brain.
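
      A minimal sketch, assuming nothing about any particular model (the weights and scoring below are made up, not a real transformer). The toy Python only illustrates what "stateless" means here: generation is a pure function of frozen weights and the prompt, so nothing is learned between calls.

          from typing import Sequence

          # Hypothetical parameters, fixed at deployment and never updated afterwards.
          FROZEN_WEIGHTS = {"w": 0.5}

          def next_token_logits(weights: dict, tokens: Sequence[int]) -> list[float]:
              """Pure function: the output depends only on (weights, tokens).
              A real transformer does far more work, but shares this property:
              no internal state survives the call, and the weights never change."""
              vocab_size = 4
              score = sum(tokens) * weights["w"]
              return [score + i for i in range(vocab_size)]

          # Calling twice with the same prompt gives identical results; the model
          # has not "learned" anything from the first call.
          prompt = [1, 2, 3]
          assert next_token_logits(FROZEN_WEIGHTS, prompt) == next_token_logits(FROZEN_WEIGHTS, prompt)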

Not your brain, but the code you produce would be, if it includes portions of GPL code that you remembered.

> Doesn't mean my brain is GPLed.

It would be if they could get away with it. The likes of Disney would delete your memories of their films if they could: in their view, if you want to enjoy the film again, you should have to pay them for the privilege, not recall the last time you watched it.

  • Imma pitch them a cinema exit turnstile with a barcode reader and a bat. You pay the retention tax or you get bonked. Once they see the ROI, we can expand the service through a collaboration with the likes of Uber to ensure equal experience quality at home.

> So what? I can probably produce parts of the header from memory. Doesn't mean my brain is GPLed.

Your brain is part of you. Some might say it is your very essence. You are human. Humans have inalienable rights that sometimes trump those enshrined by copyright. One such right is the right to remember things you've read. LLMs are not human, and thus don't enjoy such rights.

Moreover, your brain is not distributed to other people. It's more like a storage medium than a distribution. There is a lot less furore about LLMs that are used only as storage media, where neither the models themselves nor their outputs are distributed. They're obviously not very useful, though.

So your analogy is poor.