Comment by senordevnyc
5 days ago
As I recall a team at Anthropic is exploring this very question, and was soundly mocked here on HN for it.
what the technocratic mindprison does to a MF.
If Anthropic sincerely believes in the possibility, then they are morally obligated to follow up on it.
I'd argue they might be morally obligated not to sell access to their LLMs, if they really think they might be capable of suffering.