Comment by KK7NIL
10 hours ago
> I strongly object to anthropomorphising text transformers (e.g. "Assisted-by").
I don't think this is anthropomorphising, especially considering they also include non-LLM tools in that "Assisted-by" section.
We're well past the Turing test now; whether these things are actually sentient or not is of no pragmatic importance if we can't distinguish their output from a sentient creature's, especially when it comes to programming.
> We're well past the Turing test now
Nope, there is no “the” Turing test. Go read his original paper before parroting pop-sci nonsense.
The Turing test paper proposes an adversarial game to deduce whether the interviewee is human. It’s extremely well thought out. Seriously, read it. Turing wagered that within about fifty years an average interrogator would have no better than a 70% chance of making the right identification after a few minutes of questioning. He never claims there to be a definitive test that establishes sentience.
Turing may have won that wager (impressive), but there are clear tells, like “how many r’s are in ‘strawberry’?”, that an informed interrogator could reliably exploit.
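The point of that tell is that the question is trivially checkable by deterministic means, so an interrogator can grade answers with certainty. A minimal sketch (plain Python, no LLM involved):

```python
# Count occurrences of a letter in a word deterministically --
# the kind of ground truth an interrogator can check an answer against.
def letter_count(word: str, letter: str) -> int:
    return word.lower().count(letter.lower())

print(letter_count("strawberry", "r"))  # prints 3
```

An informed interrogator just needs a supply of questions like this where the correct answer is mechanically verifiable but known to trip up token-based models.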
Would you say "assisted by vim" or "assisted by gcc"?
It should be either something like "(partially/completely) generated by" or if you want to include deterministic tools, then "Tools-used:".
The Turing test is an interesting thought experiment, but we've seen that it's easy for LLMs to sound human-like and make authoritative, convincing statements while being completely wrong or full of nonsense. Passing it is not a measure of intelligence, artificial or otherwise. (Though I find it quite amusing to think that the point at which a person chooses to start calling LLMs intelligent is somewhat indicative of their own intelligence level.)
> whether these things are actually sentient or not is of no pragmatic importance if we can't distinguish their output from a sentient creature, especially when it comes to programming
It absolutely makes a difference: you can't own a human, but you can own an LLM (or a corporation, which IMO is just as wrong as owning a human).
Humans have needs which must be continually satisfied to remain alive. Humans also have moral value (a positive one - at least for most of us), which dictates that rendering someone unable to remain alive is wrong.
Now, what happens if LLMs have the same legal standing as humans and are thus able to participate in the economy in the same manner?
If a linter insists on a weird line of code, I’m probably commenting that line as “recommended by whatever-linter”, yes.
I wouldn't but I can see why some people would.
I can't say exactly where I draw the line, but here's one difference I notice:
A recommendation can be both a thing and an action. A piece of text is a recommendation, and it does not matter how it was created.
Assistance implies some parity in capabilities and cooperative work. It can also pretty much only be an action; you cannot say "here is some assistance" and point to a thing.