
Comment by sdevonoes

17 hours ago

But why give Anthropic/OpenAI our money? Nonsense. Use open models.

Quality, simplicity, speed.

I have an ML setup with two 4090s and 128 GB of RAM; it runs warm when I use it for fine-tuning or batch processes.

I don't run them for coding. It's a lot easier and nicer to play around with better models for just $20.

The author got $50 in free credits.

Also, Anthropic is by far the best; open (local) models are glorified autocomplete at best, unless you casually have €20k worth of hardware at home.

  • Disagree. Qwen 3.6 and opencode have built and helped plan entire feature sets, such as vectorizing and searching data, setting up a UI to manage categorized search data, some test systems around this, etc.

    Very usable locally, assuming you set up your local tooling correctly and you're an actual programmer who can generally help drive this stuff correctly, not just a vibe coder.

    • How big of a Qwen model are you running that can plan and implement entire feature sets?

      I’ve tried multiple models that I can run locally, and on an M4 Max MacBook they’re all very much just glorified autocomplete, but slower.

  • Why assume local when you can easily use any of the open models via OpenRouter or any number of similar services?

    • The OP said “But why give Anthropic/OpenAI our money? Nonsense. Use open models.”

      Then I’d be giving money to OpenRouter and a Chinese model provider. Is that better?

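For readers curious what the OpenRouter route in this thread actually looks like in code: OpenRouter exposes an OpenAI-compatible chat completions endpoint, so talking to an open model is one HTTP POST. This is a minimal sketch; the endpoint URL matches OpenRouter's documented API, but the model ID (`qwen/qwen3-coder`) and the `OPENROUTER_API_KEY` environment variable are illustrative assumptions — substitute whatever openrouter.ai lists and however you store your key.

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat completions endpoint.
API_URL = "https://openrouter.ai/api/v1/chat/completions"


def build_request(prompt: str, model: str = "qwen/qwen3-coder") -> urllib.request.Request:
    """Build (but do not send) a chat completion request for an open model.

    The model ID is illustrative; check openrouter.ai for current IDs.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_request("Sketch a function that vectorizes documents for search.")
# To actually send it (requires a funded OpenRouter key and network access):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
print(json.loads(req.data)["model"])
```

Because the endpoint speaks the OpenAI wire format, the same request works against a local server (e.g. a llama.cpp or Ollama instance) by swapping the URL, which is the "local tooling" setup the thread is arguing about.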