← Back to context

Comment by add-sub-mul-div

1 day ago

1. Your chatbot doesn't have its own internet scale search index.

2. You're being given information that may or may not be coming in part from junk sites. All you've done is give up the agency to look at sources and decide for yourself which ones are legitimate.

As for point one, is that true? I thought ChatGPT and Perplexity had their own indexes.

I’m quite happy trading off the agency of wading through trash to an LLM. In fact, I would say that’s something they’re pretty good at.

  • > look at sources and decide ... which ones are legitimate

    > I would say that’s something they’re pretty good at.

    Lol. Lmao, even.

    Seriously, LLMs are famously terrible at this. It's the entire problem behind prompt injection.

    https://en.wikipedia.org/wiki/Prompt_injection

    They're really good at... ingesting the trash. Yeah, that's pretty much their whole purpose. But understanding it as trash? Not even close. LLMs don't have taste. As another commenter wrote, it's just regurgitating it back.