Comment by eikenberry
18 hours ago
Why assume local when you can easily use any of the open models via openrouter or any number of similar services.
The OP said "But why give Anthropic/openai our money? Nonsense. Use open models."
Then I'd be giving money to openrouter and a Chinese model provider; is that better?
Yes, it is better. They are releasing open models, unlike Anthropic. Additionally, other (non-Chinese) companies run the open models, so if that is the issue, you have options.
"If you don't pay for a product, you are the product". That's always been emphasized on HN threads related to FB, Google, etc.
Are LLMs different?