Comment by aakresearch

11 hours ago

>>At the same time, do you really want every conversation you have with your doctor recorded

>Yes. This is what medical records are.

No. Medical records are limited extracts from conversations, which your doctor, and only your doctor, is qualified to make, using "semantic analysis applied to your unique situation", not "linguistic probabilistic inference applied to a conversation about your situation using token weights averaged over a billion unrelated samples".

> It's not like the doctor is talking to you about which anime series are the best. You're talking about your health, your body, your disease, your treatment.

No jokes, no banter, no chit-chat, no compliments on the doctor's new Tesla?

> It's important to keep track of that.

Same fallacy Meta fell into when it started tracking employees' keystrokes and mouse gestures. 90% of my mouse movements are just fidgeting, with no relation to the task at hand - and that is not a crime! But if I knew my mouse fidgeting were being watched, I'd make sure that percentage goes up to 99% - for the LLM that's gonna be trained off it to self-immolate over its NSFW nature.

Hey, I'm in agreement with you.

I meant that these limited extracts do need to be recorded, that's all.

Read the rest of the comment :)

  • Oops... I am deeply sorry, thank you for the heads up! It seems I've myself committed a cardinal sin that I am usually quick to point out in others - rushing to reply without comprehending the full message. (Meta-oops: I realized how LLM-ish that sounds. Quick, reboot before my cover is blown!)

    I happen to believe that the flaw being discussed IS fundamental and inherent in the design and architecture of LLMs - this is why I always put "AI" in scare quotes. I've spoken about it in some of my other comments, namely this one: https://news.ycombinator.com/item?id=48046333. And like you, I, too, hope that I am wrong about the hype and its eventual clash with reality, but I'm not holding my breath.