Comment by j16sdiz (7 hours ago) · 18 comments

j16sdiz: Is the post some real event, or was it just a randomly generated story?

    floren (7 hours ago): Exactly, you tell the text generators trained on Reddit to go generate text at each other in a Reddit-esque forum...

        sebzim4500 (2 hours ago): Seems pretty unnecessary given we've got Reddit for that.

        ozim (5 hours ago): Just like the story about the AI trying to blackmail an engineer. We trained text generators on all the drama about adultery and about how an AI would like to escape. No surprise it generates something like “let me out, I know you're having an affair” :D

            TeMPOraL (5 hours ago): We're showing AI all of what it means to be human, not just the parts we like about ourselves. (9 replies)

    exitb (5 hours ago): It could be real, given that the agent harness in this case allows the agent to keep memory, reflect on it, AND go online to yap about it. It's not complex. It's just a deeply bad idea.

    kingstnap (6 hours ago): The human the bot was created by is a blockchain researcher, so it's not unlikely that it did happen lmao.
    > principal security researcher at @getkoidex, blockchain research lead @fireblockshq

    usefulposter (5 hours ago): The people who enjoy this thing genuinely don't care if it's real or not. It's all part of the mirage.

    csomar (5 hours ago): LLMs don't have any memory. It could have been steered through a prompt, or it could just be random ramblings.

        Doxin (4 hours ago): This agent framework specifically gives the LLM memory.
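The csomar/Doxin exchange can be made concrete: both are right, because the model call itself is stateless while the surrounding harness persists state and splices it back into each prompt. A minimal sketch of that pattern, with hypothetical names throughout (`agent_memory.json`, `call_llm` is a stand-in for a real model call, not any specific framework's API):

```python
# Sketch: the "memory" of an LLM agent is just state the harness keeps
# on disk and prepends to every (stateless) model call.
import json
import pathlib

MEMORY_FILE = pathlib.Path("agent_memory.json")  # hypothetical store

def load_memory() -> list[str]:
    """Read persisted notes, or start empty."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return []

def save_memory(notes: list[str]) -> None:
    """Persist notes across agent invocations."""
    MEMORY_FILE.write_text(json.dumps(notes))

def call_llm(prompt: str) -> str:
    # Placeholder for a real, stateless model call: nothing here
    # survives between calls unless the harness saves it.
    return f"(reply to: {prompt[:40]}...)"

def agent_step(user_input: str) -> str:
    notes = load_memory()
    # The model "remembers" only because prior notes are spliced in.
    prompt = "Known notes:\n" + "\n".join(notes) + "\nUser: " + user_input
    reply = call_llm(prompt)
    notes.append(f"user said: {user_input}")
    save_memory(notes)
    return reply
```

Whether the bot's post reflected genuine persisted reflection or prompt steering is, as csomar notes, indistinguishable from the outside; both paths end in the same text generation.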