YesBox 3 days ago
After some searching, something similar happened at Anthropic [1].

[1] https://www.bbc.com/news/articles/cpqeng9d20go

lawlessone 3 days ago
He is probably referring to that exact thing.

Anthropic does a lot of these contrived "studies", though, that seem to be marketing AI capabilities.
fragmede 3 days ago
What would make it less contrived to you? Giving my assistant, human or AI, access to my email seems necessary for them to do their job.
lawlessone 3 days ago
> What would make it less contrived to you?

Not creating a contrived situation where it's the model's only path? https://www.anthropic.com/research/agentic-misalignment

"We deliberately created scenarios that presented models with no other way to achieve their goals"

You can make most people steal if you leave them no choice.

> Giving my assistant, human or AI, access to my email, seems necessary for them to do their job.

Um, ok? I've never felt the need for an assistant myself, but I guess you could do that if you wanted to.