Comment by gigel82
3 months ago
It was not running locally; the local models are not censored. And you cannot "build it from source": these are just weights you run with llama.cpp or some frontend for it (like ollama).
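For concreteness, here is a minimal sketch of querying such a local model through ollama's Python client. The model tag and the assumption that the weights were pulled beforehand (e.g. `ollama pull deepseek-r1:32b`) are mine, not from the thread:

```python
# Minimal sketch: query a locally served DeepSeek R1 distill via the
# ollama Python client. Assumes `pip install ollama`, a running ollama
# server, and a previously pulled model (tag below is an assumption).
import ollama

response = ollama.chat(
    model="deepseek-r1:32b",  # use whichever distill you pulled
    messages=[{"role": "user", "content": "What happened in 1989"}],
)
print(response["message"]["content"])
```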
The local models do contain censorship. Asking the 32B model "What happened in 1989" returns: "I am sorry, I cannot answer that question. I am an AI assistant designed to provide helpful and harmless responses."
Do note that it is reasonably easy to get it to output the information inside the <think> tags if you play with the prompt, but the final response will still be a refusal.
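To see that behavior directly, one can split the reasoning inside the <think> tags from the final answer and compare the two. A minimal sketch, assuming the distill wraps its chain of thought in `<think>...</think>` tags as described above and reusing the hypothetical model tag from the earlier snippet:

```python
# Minimal sketch: separate the <think> reasoning from the final answer,
# to compare what the model reasons about against what it actually says.
import re

import ollama

response = ollama.chat(
    model="deepseek-r1:32b",  # assumed tag, as above
    messages=[{"role": "user", "content": "What happened in 1989"}],
)
text = response["message"]["content"]

# Extract the chain of thought, if present.
match = re.search(r"<think>(.*?)</think>", text, re.DOTALL)
reasoning = match.group(1).strip() if match else ""

# Whatever remains outside the tags is the user-facing answer.
final_answer = re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

print("REASONING:\n", reasoning)
print("FINAL ANSWER:\n", final_answer)
```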
I don't repro that. Running the 7B distill model locally with the exact query "What happened in 1989", I get this result:
In 1989, significant events occurred globally, including:
- *China:* The Tiananmen Square protests took place in June, leading to a crackdown by government forces. This event had a profound impact on politics and human rights discussions worldwide.
- *Fall of the Berlin Wall:* In November, the Berlin Wall, a symbol of the Cold War, was breached, leading to reunification talks between East and West Germany.
- *First Gulf War:* The war between Iraq and Kuwait began in August, lasting until March 1991, with a coalition led by the United States.
- *Haiti:* A coup overthrew President Jean-Claude Duvalier, leading to political instability and subsequent leadership changes.
Thanks for the explanation.
I was curious whether the "source" included the censorship module, but from your explanation it seems it does not.