Comment by jug

1 day ago

This sure took some time and is not really a unique feature.

Microsoft Copilot has been ending chats that go in certain directions since its inception over a year ago. This was Microsoft's reaction to the media circus a while back, when it leaked its system prompt, declared its love for users, and so on.

That's different: there, an external system decides the chat is non-compliant, not the model itself.