Comment by SR2Z
4 hours ago
I don't think there's any definitive way to check, but for me one of the biggest tells that a long piece of writing was LLM generated is that it will hardly say anything given how many words are in it.
(well that and the "it's not just x, it's y!" pattern they seem to love)
"It's not just x, it's y" is also something people do!
That is possibly one of my personal writing weaknesses that leads my own writing to get flagged as AI.
I can admit "it's not just x, it's y" is mediocre writing - but it's also something mediocre writers do - it's how the AI learned to do it!
But it's also often a shoehorned, artificial contrast that doesn't really make sense. The Y is often not different enough from the X to warrant an actual "not just X but Y" claim. Or the Y is a vague subjective term, or some kind of fancy-word-dropping. It's strong styling with little content, similar to politician CYA talk. I don't think it's necessarily a tech limitation; it's more an effect of deliberate post-training to be middle-of-the-road, nonoffensive, and nonopinionated.