
Comment by xnx

17 days ago

Glad Gemini is getting some attention. Using it is like a superpower. There are so many discussions about ChatGPT, Claude, DeepSeek, Llama, etc. that don't even mention Gemini.

Before the 2.0 models their offerings were pretty underwhelming, but now they can certainly hold their own. I think Gemini will ultimately be the LLM that eats the world: Google has the talent and, most importantly, their own custom hardware (which is why their prices are dirt cheap and their context windows are huge).

Google had a pretty rough start compared to ChatGPT and Claude. I suspect that left a bad taste in many people's mouths, particularly because evaluating so many LLMs is a lot of effort on its own.

Llama and DeepSeek are no-brainers; the weights are public.

  • No-brainer if you're sitting on a >$100k inference server.

    • Sure, that's fair if you're aiming for state-of-the-art performance. Otherwise, you can get close on reasonably priced hardware by using smaller distilled and/or quantized variants of Llama/R1.

      Really, though, I just meant "it's a no-brainer that they are popular here on HN".

Google was not serious about LLMs; they could not even figure out what to call them. There is always a risk that they will get bored and just kill the whole thing.