Comment by Terretta
21 days ago
> If you are going to use human brain cells to memorize protected content and sell it as a product, that's still an issue based on current copyright laws.
And yet, that's all most billable hours at McKinsey, BCG, and KPMG are for. Those consultants memorized copyrighted stuff so your executives didn't have to.
It's very difficult to explain how GPT is not consulting.
The question there still comes down to what they did with the memorized content. There's nothing wrong with memorizing copyrighted content; the legal problem is reselling it without paying royalties, under contract, to the owner of the copyright.
The problem with LLMs is the techbro crowd trying to pretend they are like thinking humans (all these analogies with us memorizing things) when it comes to rights like access to information and copyright, but not like thinking humans when it comes to using the LLMs themselves.
You’d think any logical person would believe only one or the other can be true, but big tech got many people believing this paradox because the industry depends on it. As soon as the paradox collapses, the industry is revealed to be based either on IP theft or on slavery.
There is no real problem to argue about: laws and basic rights and freedoms exist for humans. If %thing% (be it made of chips or brain cells) is not considered human, then the laws and rights apply to the humans who operate it; if the thing is considered human, then it itself has human rights to be reckoned with.