Comment by andai
12 hours ago
Thanks. I call this method Power Coding (like Power Armor): you're still doing everything yourself except typing out the syntax.
I've found that the smaller the model, the better this method works: smaller models can generally handle the task, and you benefit more from iteration speed than from anything else.
I don't have the hardware to run even tiny LLMs at anything approaching interactive speed, so I use APIs. The model I ended up with was Grok 4 Fast, because it's weirdly fast.
ArtificialAnalysis has an "end-to-end" response time section, and Grok 4 Fast was the best there for a long time, though many other models are catching up now.
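A minimal sketch of the kind of API call involved, assuming an OpenAI-compatible client pointed at xAI's endpoint (the base URL, model identifier, and environment variable name here are assumptions, not confirmed specifics; check the provider's docs):

    # Sketch: chat completion against an OpenAI-compatible endpoint
    # using the openai Python client.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://api.x.ai/v1",        # assumed xAI endpoint
        api_key=os.environ["XAI_API_KEY"],     # hypothetical env var name
    )

    resp = client.chat.completions.create(
        model="grok-4-fast",                   # assumed model identifier
        messages=[
            {"role": "system", "content": "Only type out the code I describe."},
            {"role": "user", "content": "Write the loop we just discussed."},
        ],
    )
    print(resp.choices[0].message.content)

The point is less the specific provider than the round-trip latency: with a fast enough endpoint, the describe-generate-review loop stays interactive.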