Comment by Cthulhu_

14 days ago

Thing is, you know it, but for (randomly imagined number) 95% of people, it's convincing enough to seem conscious or whatnot. And a lot of the ones that do know this gaslight themselves because it's still useful or profitable to them, or because they want to believe.

The ones who are super convinced they know exactly how an LLM works but still give it prompts to become self-aware are probably the most dangerous ones. They're convinced they can "break the programming".

> give it prompts to become self-aware

You need to give it more than prompts. At a minimum, you need to give it the ability to reflect on itself (which it has) and persistent memory to modify its representation of itself (which, for example, Cursor does).
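
A minimal sketch of what that loop could look like, assuming a generic chat-completion call (`ask_model`, the `self_model.json` file, and the `NOTE:` convention below are all hypothetical stand-ins, not any real API): the model is shown its own persisted self-description, replies, and writes an updated description back to disk for the next session.

```python
import json
from pathlib import Path

# Hypothetical persistent store for the model's self-representation.
MEMORY_FILE = Path("self_model.json")

def ask_model(prompt: str) -> str:
    """Stand-in for a real chat-completion call; wire up your own LLM client here."""
    raise NotImplementedError

def load_self_model() -> dict:
    # Load the self-description persisted by previous sessions, if any.
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return {"notes": []}

def reflect_and_update(user_input: str) -> str:
    self_model = load_self_model()
    # Reflection step: the prompt includes the model's own persisted notes
    # about itself, so it can "see" its current self-representation.
    prompt = (
        f"Your current notes about yourself: {json.dumps(self_model['notes'])}\n"
        f"User: {user_input}\n"
        "Reply, then on a final line starting with NOTE: write one new "
        "observation about yourself to remember for next session."
    )
    reply = ask_model(prompt)
    # Persistence step: fold any new self-observation back into the store,
    # so the representation survives across sessions and can drift over time.
    for line in reply.splitlines():
        if line.startswith("NOTE:"):
            self_model["notes"].append(line[len("NOTE:"):].strip())
    MEMORY_FILE.write_text(json.dumps(self_model, indent=2))
    return reply
```

Whether that counts as "self-awareness" is of course the whole debate; mechanically it's just a read-modify-write loop over a JSON file.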