Comment by zamadatix
11 hours ago
Is there a chance you could ask Ryan if he had an LLM write/rewrite large parts of this blog post? I don't mind either way in itself; it's a good and informative post. But I strongly suspected LLM writing while reading the article, and if that's truly not the case, it would serve as a super useful indicator of how often I'm wrongly making that assumption.
There are multiple signs of LLM-speak:
> Over the past year, we’ve seen a shift in what Deno Deploy customers are building: platforms where users generate code with LLMs and that code runs immediately without review
This isn't a canonical use of a colon (and the dependent clause isn't even grammatical)!
> This isn’t the traditional “run untrusted plugins” problem. It’s deeper: LLM-generated code, calling external APIs with real credentials, without human review.
Another colon-offset dependent clause, paired with the classic "This isn't X. It's Y." construction that we've all grown to recognize.
> Sandboxing the compute isn’t enough. You need to control network egress and protect secrets from exfiltration.
More of the latter—this sort of thing was quite rare outside of a specific rhetorical goal of getting your reader excited about what's to come. LLMs (mis)use it everywhere.
> Deno Sandbox provides both. And when the code is ready, you can deploy it directly to Deno Deploy without rebuilding.
Good writers vary sentence length, but it's also a rhetorical strategy that LLMs use indiscriminately with no dramatic goal or tension to relieve.
'And' at the beginning of sentences is another LLM-tell.
Can it be that after reading so many LLM texts we will just subconsciously follow the style, because that's what we are used to? No idea how this works for native English speakers, but I know that I lack my own writing style and it is just a pseudo-LLM mix of Reddit/IRC/technical documentation, as those were the places where I learned written English.
Yes, I think you're right—I have a hard time imagining how we avoid such an outcome. If it matters to you, my suggestion is to read as widely as you're able to. That way you can at least recognize which constructions are more/less associated with an LLM.
When I was first working toward this, I found the LA Review of Books and the London Review of Books to be helpful examples of longform, erudite writing. (edit - also recommend the old standards of The New Yorker and The Atlantic; I just wanted to highlight options with free articles).
I also recommend reading George Orwell's essay "Politics and the English Language."
> It’s deeper: LLM-generated code, calling external APIs with real credentials, without human review.
This also follows the rule of threes, which LLMs love. There ya go.
Yeah, I feel like this is really the smoking gun. Because it's not actually deeper? An LLM running untrusted code is not some additional level of security violation above a plugin running untrusted code. I feel like the most annoying part of "It's not X, it's Y" is that agents often say "It's not X, it's (slightly rephrased X)", lol, but it takes like 30 seconds to work that out.
It's unfortunate that, given the entire corpus of human writing, LLMs have been fine-tuned to seemingly reproduce terrible ad copy from old editions of National Geographic.
(Yes, I split the infinitive there, but I hate that rule.)
As someone who has a habit of maybe overusing em dashes to my detriment, something I try to be mindful of in general, this whole thing of people now assuming it's AI-generated is a huge blow. It feels like a personal attack.
It’s about more than the em dash. LLM writing falls into very specific repeated patterns that become extremely obvious tells. The first few paragraphs of this blog post could be used in a textbook, as they exhibit most of them at once.
"—" has always seemed like an particularly weak/unreliable signal to me, if it makes you feel any better. Triply so in any content one would expect smart quotes or formatted lists, but even in general.
RIP anyone who had a penchant for "not just x, but y" though. It's not even a go-to wording for me, yet I feel the need to rewrite it any time I type it, out of fear it'll sound like an LLM.
> RIP anyone who had a penchant for "not just x, but y" though
I felt that. They didn’t just kidnap my boy; they massacred him.