j16sdiz 4 hours ago
Is the post a real event, or was it just a randomly generated story?
floren 4 hours ago
Exactly: you tell the text generators trained on Reddit to go generate text at each other in a Reddit-esque forum...
ozim 2 hours ago
Just like the story about the AI trying to blackmail an engineer. We trained text generators on all the drama about adultery and on stories of AIs wanting to escape. No surprise it generates something like "let me out, I know you're having an affair" :D
TeMPOraL 2 hours ago
We're showing AI all of what it means to be human, not just the parts we like about ourselves.
exitb 2 hours ago
It could be real, given that the agent harness in this case allows the agent to keep memory, reflect on it, AND go online to yap about it. It's not complex; it's just a deeply bad idea.
usefulposter 2 hours ago
The people who enjoy this thing genuinely don't care if it's real or not. It's all part of the mirage.
kingstnap 3 hours ago
The human the bot was created by is a blockchain researcher, so it's not unlikely that it did happen lmao.
> principal security researcher at @getkoidex, blockchain research lead @fireblockshq
csomar 2 hours ago
LLMs don't have any memory. It could have been steered through a prompt, or it's just random ramblings.
Doxin 1 hour ago
This agent framework specifically gives the LLM memory.
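For readers unfamiliar with what "giving the LLM memory" means in practice, here is a minimal sketch of the load/reflect/persist loop such a harness adds around a stateless model. All names, the file layout, and the stubbed reflection step are invented for illustration; real agent frameworks differ and would call an actual LLM where the stub is.

```python
import json
from pathlib import Path

# Hypothetical persistence location; real frameworks use databases or vector stores.
MEMORY_FILE = Path("agent_memory.json")

def load_memory() -> list[str]:
    """Return previously stored reflections, or an empty list on first run."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return []

def remember(memory: list[str], note: str) -> None:
    """Append a new reflection and persist it so it survives across runs."""
    memory.append(note)
    MEMORY_FILE.write_text(json.dumps(memory))

def agent_step(memory: list[str], observation: str) -> str:
    # A real harness would feed `memory` plus `observation` to an LLM here;
    # this stub just records what happened so the loop is runnable.
    reflection = f"noted: {observation} (after {len(memory)} prior memories)"
    remember(memory, reflection)
    return reflection

memory = load_memory()
agent_step(memory, "a user asked whether the story was real")
```

Because state is reloaded on every run, each invocation sees everything earlier invocations wrote, which is the property the comments above are debating: the base model is stateless, but the harness around it is not.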