Comment by danielmarkbruce

1 year ago

It's what one imagines the first cars were like - if you were mechanically inclined, awesome. If not, screwed. If you know LLMs and how a basic RAG pipeline works, deep research is wonderful. If not, screwed.

I can't help but feel there's a difference between a car that runs 90% of the time and breaks down the other 10%, and one that turns the direction you tell it 90% of the time but the opposite direction the other 10%.

  • Also, you won't necessarily know it made that wrong turn until it's too late (you're in the river now).

    • If you can't critically read the output of an LLM, you shouldn't be using it for the given task. Many people have made the (good) analogy to an intern.