Comment by HelloUsername

2 months ago

> Lumo is powered by open-source large language models (LLMs) which have been optimized by Proton to give you the best answer based on the model most capable of dealing with your request. The models we’re using currently are Nemo, OpenHands 32B, OLMO 2 32B, and Mistral Small 3. These run exclusively on servers Proton controls so your data is never stored on a third-party platform. Lumo’s code is open source, meaning anyone can see it’s secure and does what it claims to. We’re constantly improving Lumo with the latest models that give the best user experience.

Running those small models is usually not a problem for SMEs or homelabs. Serving full Kimi K2, Qwen3, or DeepSeek V3/R1 under Proton’s conditions would be an interesting offer.
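
For reference, self-hosting the small models really is straightforward: llama.cpp’s llama-server, vLLM, and Ollama all expose an OpenAI-compatible endpoint, so a minimal client sketch looks like the following (the base URL and model name are placeholders for whatever you actually run):

    # Minimal sketch: query a locally served open-weight model through an
    # OpenAI-compatible endpoint (llama.cpp's llama-server, vLLM and Ollama
    # all expose one). Base URL and model name are placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # your local server
        api_key="unused",                     # local servers typically ignore the key
    )

    reply = client.chat.completions.create(
        model="mistral-small-3",  # placeholder model id
        messages=[{"role": "user", "content": "Why does local hosting keep queries private?"}],
    )
    print(reply.choices[0].message.content)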

I wonder how this is different from Apple's approach (Private Cloud Compute).

Which means the performance will be noticeably worse than that of any of the mainstream models.

"The responses are worse, but don't worry, at least the queries are private!" says nobody.

  • It’s funny how, when it’s Apple, everyone is happy to defend even the most incomprehensible decisions with “privacy as a feature”. For everyone else, privacy apparently doesn’t count. I think “Donald Trump can’t get your photos” is a pretty good selling point.

    • > everyone is happy to defend even the most incomprehensible decisions with “privacy as a feature”

      Not me. I care about privacy, and I know they care about privacy, but what I want to see is that they have a product in the first place, before all those other things.

      In fact, I more or less knew Apple wouldn't ship a good product when all they talked about was privacy instead of providing any meaningful data about performance. Turns out it's all just vaporware.

So is this aimed at small models only? Are there any advantages to these models compared to what I can run locally on a GPU with 16 GB of VRAM?
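
A rough way to see the gap: weights alone at 4-bit quantization put a ~24B model around 12 GB, which fits a 16 GB card, while the DeepSeek V3/R1 class needs hundreds of gigabytes even quantized. A back-of-envelope sketch (parameter counts are approximate):

    # Back-of-envelope VRAM needed for the weights alone; KV cache and
    # activations come on top. Parameter counts are approximate.
    def weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
        # 1e9 params * (bits / 8) bytes per param = params_billion * bits / 8 GB
        return params_billion * bits_per_weight / 8

    for name, params_b in [("Mistral Small 3 (~24B)", 24),
                           ("Qwen3 32B", 32),
                           ("DeepSeek V3/R1 (~671B total)", 671)]:
        for bits in (16, 4):
            print(f"{name} @ {bits}-bit: ~{weight_vram_gb(params_b, bits):.0f} GB")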

Would be nice to have something at the level of Claude 3.5.

Does anyone know if there is a hosting provider for Kimi K2, Qwen3, or DeepSeek V3/R1 in the EU?