Comment by no-name-here

5 hours ago

I'm not agreeing with the OP's proposal, but with today's LLMs, no matter how you license your code and no matter what ToS or other prohibitions you attach to it, there seems to be no way to prevent LLMs from absorbing it and using it to implement a replacement based on your code, unless you choose to go closed source. There's no "opt out" for someone's source code, let alone an opt-in (again, unless we give up open source).

(It's a very different situation for the AI companies themselves: Anthropic, for example, keeps Claude Code closed source, and its ToS strictly prohibits using it to work on anything that could compete with them. Can you imagine if the Windows or macOS ToS prohibited people from using those OSes to work on a competing OS, or if the VSCode ToS prohibited people from using VSCode to work on another editor?)