
Comment by 3578987532688

2 days ago

My tip: Move away from Google to an LLM that doesn't respond with "There was a problem getting a response" 90% of the time.

Are we getting billed for these? The billing is anything but transparent.

  • Would be nice to have an official confirmation. Once the tokens reach the user, they have likely already been counted.

    It sucks when the LLM goes on a rant only to stop because of hardcoded safeguards, or, as I run into often enough with Copilot, it generates some code, notices it matches existing public code, and cancels the entire response. But that still counts towards my usage.

I had a terrible first impression of Gemini CLI when it was released a few months ago because of the constant 409 errors.

With the Gemini 3 release I decided to give it another go, and now the error has changed to: "You've reached the daily limit with this model", even though I have an API key with billing set up. It wouldn't even let me try Gemini 3, and even after switching to Gemini 2.5 it would still throw this error after a few messages.

Google might have the best LLMs, but its agentic coding experience leaves a lot to be desired.

  • I had to make a new API key. My old one got stuck with this error; it's on Google's end. A new key resolved it immediately (quick smoke-test sketch at the bottom of this thread).

    • and then losing half a day setting up billing, with a limited virtual credit card so you at least have some cost control

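    For anyone else hitting the stuck-key error: before blaming the CLI, I'd smoke-test the fresh key directly. A minimal sketch, assuming the google-genai Python package is installed and the new key is exported as GEMINI_API_KEY (the model name and prompt below are just placeholders):

        # Smoke test for a fresh Gemini API key using the google-genai SDK.
        # Assumes: pip install google-genai, and the key exported as GEMINI_API_KEY.
        import os

        from google import genai

        # Pass the key explicitly so it's obvious which one is being tested.
        client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])

        # Any cheap model works here; swap in whatever your billing tier allows.
        response = client.models.generate_content(
            model="gemini-2.5-flash",
            contents="Reply with the single word: ok",
        )
        print(response.text)  # If this prints, the key and billing path work.

    If that call succeeds but Gemini CLI still throws the daily-limit error, the problem is on the CLI/quota side rather than your key or billing setup, which matches the stuck-key behaviour described above.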