Comment by ayoung5555

8 days ago

As much as the general public seems to be turning against AI, people only seem to care when they're aware it's AI. Those of us who pay deliberate attention to it are better tuned to identify LLM-speak and generated slop.

Most human writing isn't good. Take LinkedIn, for example. It didn't suddenly become bad because of LLM-slop posts - humans pioneered its now-ubiquitous style. And now even when something is human-written, we're already seeing humans absorb linguistic patterns common to LLM writing. That said, I'm confident slop from any platform with user-generated content will eventually fade away from my feeds because the algorithms will pick up on that as a signal. (edited to add "from my feeds")

What concerns me most is that there's absolutely no way this isn't detrimental to students. While AI can be a tool in STEM, I'm hearing from teachers among family and friends that everything students write is from an LLM.

Leaning on AI to write code I'd otherwise write myself might be a slight net negative on my ability to write future code - but brains are elastic enough that I could close an n-month gap in n/2 months or something.

From middle school to university, students are doing everything for the first time, and there's no recovering habits or memories that never formed in the first place. They made the ACT easier 2 years ago (reduced # of questions) and in the US the average score has set a new record low every year since then. Not only is there no clear path to improvement, there's an even clearer path to things getting worse.

I spent several years trying to get ground truth out of digital medical records and I would draw this parallel to AI slop:

With traditional medical records, you could see what the practitioner did and covered because only that was in the record.

With computerized records, the intent, the thought process, and most of the signal you would use to validate internal consistency were hidden behind a wall of boilerplate and formality that armored the record against scrutiny.

Bad writing on LinkedIn is self-evident. Everything about it stinks.

AI slop is like a Trojan Horse for weak, undeveloped thoughts. They look finished, so they sneak into your field of view and consume whatever additional attention is required to finally realize that despite the slick packaging, this too is trash.

So “AI slop,” in this worldview, is a complaint that historical signals of quality based purely on form are no longer useful gatekeepers for attention.

  • re: traditional vs electronic medical records, if you haven't read Seeing Like a State, I highly recommend checking it out. The book is all about the unexpected side effects of improving the legibility of information for decision makers - these attempts can erase or elide important local detail, which ultimately sabotages the bureaucracy's aim of improving the system.

Did we lose something when we invented the calculator and stopped teaching the times table in schools? There have been millions of words discussing this, and the general consensus amongst us crusty old folks was that yes, the times table was useful and losing the ability to do mental arithmetic easily would be bad.

Turns out we were wrong. Everyone carries a calculator now on their phone, even me. Doing simple maths is a matter of moments on the calculator app, and it's rare that I find myself doing the mental arithmetic that used to be common.

I can't remember phone numbers any more. I used to have a good 50+ memorised, now I can barely remember my own. But the point is that I don't need to any more. We have machines for that.

Do we need to be able to write an essay? I have never written one outside of an educational context. And no, this post does not count as an essay.

I was expelled from two kindergartens as a kid. I was finally moved to a Montessori school where they taught individually by following our interests, and there I thrived. Later, I moved back into a more conventional educational environment and I fucking hated every minute of it. I definitely learned despite my education, not because of it. So if LLMs are about to completely disrupt education then I celebrate that. This is a good thing. Giving every kid a personal tutor that can follow their interests and teach them things that they actually want to learn, at the pace they want to learn them, is fucking awesome.

  • Any competent thinker should be able to structure an argument and present it in written form, that's an important skill to have.

    If someone is unable to write an essay arguing something, unable to articulate complex thoughts and back them up with evidence, what does that indicate about their thinking?

    I don't write essays either, but I'm sure I could. And maybe some of those docs or emails I write at work are made more effective by that.

    • There are literally hundreds of millions of people in the Anglosphere who have graduated from their education unable to coherently structure an argument and present it in written form.

      It indicates nothing about their thinking. One of the smartest people I've known left school at 14 and couldn't read or write.

      We mistake education for intelligence often. We mistake erudition for capability often. The thing you need to get a PhD is not intelligence, but the ability to follow directions and persevere. You certainly don't need to have any original thoughts, in fact they will only get in your way.


  • Calculators are good. But we still teach times tables and long division and prohibit calculators until kids learn how to do it the “hard way.”

    We can’t give a generation of kindergarteners calculators and expect them to produce new math when they’re adults: how will they ever form mathematical problem solving skills?

    I think the same principle applies for LLMs - they can be a tool but learning how to do things without them is still essential. Otherwise we might not have any more good authors in 10 years.

    Before CAD, engineers had to draw designs on drafting boards. Similar concept here, I believe most classes still find it valuable for students to start with pencil and paper and grasp something at its most fundamental level, even if obsolete, before moving on to modern tools.

    LLMs (and calculators, and CAD) should be used as tools once the underlying mechanisms and skills are understood by their user; otherwise it’s like driving a car without knowing how to replace a flat tire. Sure, you can call AAA, but eventually if nobody learns to change a tire with their own two hands, humanity won’t be able to drive. This is obviously hyperbole, but I hope it illustrates my point.

    I’m fairly confident LLMs will be a net positive on society in the long run, just as calculators have been. But just like calculators are restricted at certain times in math classes, LLMs should be restricted in writing classes.

    • > We can’t give a generation of kindergarteners calculators and expect them to produce new math when they’re adults: how will they ever form mathematical problem solving skills?

      Arithmetic has nothing, literally nothing, to do with "new math". A calculator won't help you with algebra, or shortcut any mathematical problem solving. It will just help you with dividing up the restaurant bill, which is the hardest maths problem the vast majority of humans will encounter.

      > I think the same principle applies for LLMs - they can be a tool but learning how to do things without them is still essential. Otherwise we might not have any more good authors in 10 years.

      Did we stop having any good new portrait painters once we'd invented the camera? The people who really want to write a book will still write a book.