← Back to context

Comment by TeMPOraL

3 days ago

Once again affirming that prompt injection is social engineering for LLMs. To a first approximation, humans and LLMs have the same failure modes, and at system design level, they belong to the same class. I.e. LLMs are little people on a chip; don't put one where you wouldn't put the other.

They are worse than people: LLM combine toddler level critical thinking with intern level technical skills, and read much much faster than any person can.

  • Right. But my point is, they belong to the bucket labeled "people", not the one labeled "software", for purpose of system design.