Comment by FitchApps
3 days ago
Try WebLLM - it's pretty decent and runs entirely in-browser/offline, even for light tasks with 1B-1.5B models like Qwen2.5-Coder-1.5B-Instruct. I put together a quick prototype - CodexLocal.com - but you can essentially run a local nginx and use WebLLM as an offline app. Of course, you can just use Ollama / LM Studio, but that would require a more technical setup.
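For reference, here's a minimal sketch of what that looks like with WebLLM's OpenAI-style chat API. It assumes the @mlc-ai/web-llm package and a prebuilt model id along the lines of "Qwen2.5-Coder-1.5B-Instruct-q4f16_1-MLC" - check the library's model list for the exact name:

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // First call downloads the weights and caches them in the browser;
  // subsequent loads come from cache, so it works offline after that.
  // Model id is an assumption - see WebLLM's prebuilt model list.
  const engine = await CreateMLCEngine(
    "Qwen2.5-Coder-1.5B-Instruct-q4f16_1-MLC",
    { initProgressCallback: (p) => console.log(p.text) },
  );

  // OpenAI-style chat completion, running locally via WebGPU.
  const reply = await engine.chat.completions.create({
    messages: [
      { role: "user", content: "Write a Python one-liner to reverse a string." },
    ],
  });
  console.log(reply.choices[0].message.content);
}

main();
```

Serve that from any static server (the local nginx mentioned above) and, once the model is cached, the whole thing runs without a network connection.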