Comment by ebbi
14 hours ago
The 'multi-model consensus' feature actually looks very useful! I'm going to give this a go.
A question on OpenRouter - is it just a place to consolidate the various AI models through one billing platform, or does it do more than that? And are the costs slightly more as they take a cut in between?
> is it just a place to consolidate the various AI models through one billing platform, or does it do more than that
You can easily switch models, route to the cheapest provider (especially for open models), and you don't have to climb usage "tiers" to unlock higher rate limits like you might on OpenAI's or Anthropic's direct offerings.
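For anyone curious what "easily switch models" looks like in practice, here's a rough sketch using the OpenAI-compatible endpoint OpenRouter exposes. The model IDs and key are placeholders, not a recommendation:

```python
# Minimal sketch: OpenRouter speaks the OpenAI chat-completions API, so
# switching models is just a matter of changing the `model` string.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",  # your OpenRouter key
)

# Example model IDs; swap in whatever you want to compare.
for model in [
    "anthropic/claude-3.5-sonnet",
    "openai/gpt-4o",
    "meta-llama/llama-3.1-70b-instruct",
]:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Summarize prompt caching in one sentence."}],
    )
    print(model, "->", resp.choices[0].message.content)
```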
> And are the costs slightly more as they take a cut in between?
About 5% more: you buy credits upfront and pay a 5% fee on top. Aside from that, you pay the normal listed prices (which have always matched the direct providers, AFAIK).
Note that you also might need to think a little bit about caching: https://openrouter.ai/docs/guides/best-practices/prompt-cach...
Depending on how the context grows, it can matter quite a bit!
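For a concrete picture, this is roughly how the cache_control breakpoints from the linked docs look when routing to an Anthropic model: mark the large, stable prefix so repeated calls reuse it instead of re-billing it. A sketch only; details vary by provider, and the key and prompt are placeholders:

```python
# Sketch of Anthropic-style prompt caching through OpenRouter
# (OpenAI models cache automatically, so they need no markup).
import requests

LARGE_SYSTEM_PROMPT = "..."  # big, stable prefix worth caching

payload = {
    "model": "anthropic/claude-3.5-sonnet",
    "messages": [
        {
            "role": "system",
            "content": [
                {
                    "type": "text",
                    "text": LARGE_SYSTEM_PROMPT,
                    # Cache breakpoint: the stable prefix is reused across calls.
                    "cache_control": {"type": "ephemeral"},
                }
            ],
        },
        # Only this part changes per request, so only it is billed at full price.
        {"role": "user", "content": "Question that changes on every call"},
    ],
}

resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": "Bearer sk-or-..."},
    json=payload,
)
print(resp.json()["choices"][0]["message"]["content"])
```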
Great callout! Yes, I've tried to follow these to keep Consensus compliant with OpenRouter's prompt-caching best practices.
Appreciate the reply mate, thank you.
What's great about OpenRouter is that you have access to all providers and models, and they do the work of standardizing the interface. Our new HiveTechs Consensus IDE configures 8 profiles for you and your AI conversations, each using its own LLM from OpenRouter, plus unlimited custom profiles: you pick the provider and LLM from a list and name the profile. We also have our own built-in HiveTechs CLI that lets you use any LLM from OpenRouter, updated daily, so the moment a new model drops you can test it out without waiting for it to show up in your other favorite apps.
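If you want to watch for new models yourself, OpenRouter publishes its full model list at a public endpoint. A rough sketch of polling it; the field names reflect the current response shape and may change:

```python
# Poll OpenRouter's public model list to spot newly added models.
import requests

models = requests.get("https://openrouter.ai/api/v1/models").json()["data"]

# Newest first, assuming each entry carries a unix `created` timestamp.
for m in sorted(models, key=lambda m: m.get("created", 0), reverse=True)[:10]:
    print(m["id"], "-", m.get("name", ""))
```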