Comment by birdsongs
6 hours ago
Not arguing; it's just interesting that I read the opposite. It felt human to me, and I usually have a good spidey sense here. Maybe it's a combo of handwritten text and LLM polishing? Or just a case of a good writer whose typical output was the training input for most of these models. Good writing, in the form of novels, articles, and short stories, made up the high-value training sets.
"For a moment, the plane quivered around them like a greyhound straining on a leash." - I don't think an LLM would write this.
But hell, maybe I'm just being naive. I think we're past the point of ambiguity; we just can't know anymore. Which feels poignant to me.