
Comment by ffsm8

1 day ago

But with OpenRouter you can always just use the latest model. If you're committed to, e.g., Claude Opus, then you're better off going directly to Anthropic for sure, but if not, various other models may be fine too, depending on the use case, and massively cheaper. E.g. the new DeepSeek model with the same 1M context window, or Kimi K2.6 with a 270k context window for subagents.

> but if not, various other models may be fine too, depending on the use case, and massively cheaper

Do inference providers have standardized endpoints, or at least endpoints compatible with Claude Code? Otherwise, why pay 5.5% on all your tokens just to make it slightly easier to swap providers (i.e. changing a few URLs)?

  • > Do inference providers have standardized endpoints, or at least endpoints compatible with claude code?

    Yep, you can plug DeepSeek/Kimi/MiniMax into Claude Code just fine. Or run everything through another harness like opencode instead.
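
    For the "changing a few URLs" point: Claude Code reads its API endpoint and key from environment variables, so pointing it at a provider that exposes an Anthropic-compatible endpoint is roughly a two-line change. A minimal sketch, assuming your provider documents such an endpoint; the base URL and key below are placeholders, not real values:

    ```shell
    # Redirect Claude Code from api.anthropic.com to an
    # Anthropic-compatible endpoint. Substitute the base URL and key
    # your provider actually documents -- these are illustrative only.
    export ANTHROPIC_BASE_URL="https://api.example-provider.com/anthropic"
    export ANTHROPIC_AUTH_TOKEN="your-provider-api-key"

    claude   # launches Claude Code against the configured endpoint
    ```

    Unsetting the two variables (or starting a fresh shell) switches you back to Anthropic directly, which is the whole "swap providers by changing a few URLs" argument.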