Comment by throwaway173738

3 days ago

I can’t count the number of times the docs have been totally wrong.

And they were actually like that pre-LLM, in 2019, when I was implementing stuff for a car company on Azure. They spent _hundreds of thousands_ on CosmosDB, for less performance than a Raspberry Pi running Postgres.

  • Every marketing page and just about every second documentation page goes on and on about how fantastic CosmosDB performance and scalability is. Meanwhile, the best performance I have ever managed to squeeze out of it could be generously classified as "glacial".

    Whenever I read its docs I feel like I'm being gaslit.

Pretty surprised to hear this. I would think (assuming they're LLM-written, as the parent suggests) that MS could throw a large-context "pro" LLM at the code base and get perfect docs, updated every release?

More accurate than a person, even: where I might mistakenly copy/paste or write "Returns 404", the LLM can probably see that the code actually returns a 401.

I'm not a stranger to LLMs hallucinating things in responses but I'd always assumed that disappeared when you actually pointed it at the source vs some nebulous collection of "knowledge" in a general LLM.

  • Is it your first time using an LLM? No, they generate plausible-sounding bullshit no matter the input. Sometimes that bullshit is useful. Other times it isn't.