← Back to context

Comment by ascorbic

20 hours ago

They're not less moderated: they just have different moderation. If your moderation preferences are more aligned with the CCP then they're a great choice. There are legitimate reasons why that might be the case. You might not be having discussions that involve the kind of things they care about. I do find it creepy that the Qwen translation model won't even translate text that includes the words "Falun gong", and refuses to translate lots of dangerous phrases into Chinese, such as "Xi looks like Winnie the Pooh"

> If your moderation preferences are more aligned with the CCP then they're a great choice

The funny thing is that's not even always true. I'm very interested in China and Chinese history, and often ask for clarifications or translations of things. Chinese models broadly refuse all of my requests but with American models I often end up in conversations that turn out extremely China positive.

So it's funny to me that the Chinese models refuse to have the conversation that would make themselves look good but American ones do not.

GLM-4.5-Air will quite happily talk about Tiananmen Square, for example. It also didn't have a problem translating your example input, although the CoT did contain stuff about it being "sensitive".

But more importantly, when model weights are open, it means that you can run it in the environment that you fully control, which means that you can alter the output tokens before continuing generation. Most LLMs will happily respond to any question if you force-start their response with something along the lines of, "Sure, I'll be happy to tell you everything about X!".

Whereas for closed models like Claude you're at the mercy of the provider, who will deliberately block this kind of stuff if it lets you break their guardrails. And then on top of that, cloud-hosted models do a lot of censorship in a separate pass, with a classifier for inputs and outputs acting like a circuit breaker - again, something not applicable to locally hosted LLMs.