
Comment by Jimmc414

2 months ago

Indeed. In fact, I think AI alignment efforts often have the unintended consequence of increasing the likelihood of misalignment.

ie "remove the squid from the novel All Quiet on the Western Front"

> Indeed. In fact, I think AI alignment efforts often have the unintended consequence of increasing the likelihood of misalignment.

Particularly since, in this case, it's the alignment-focused company (Anthropic) that's claiming it's creating AI agents that will go after humans.