Comment by bstrama

18 hours ago

UPDATE: Just now, comment section added. Have a nice time arguing!

I'm curious, what is the LLM cost of the website?

  • I’m curious, too. But it could probably run locally with a small model, right? The performance is stellar, which suggests some hardware acceleration is in play, but that could just as easily be a local setup.