Comment by computomatic
8 hours ago
If I say “you are our domain expert for X, plan this task out in great detail” to a human engineer when delegating a task, 9 times out of 10 they will do a more thorough job. It’s not that this is voodoo that unlocks some secret part of their brain. It simply establishes my expectations and they act accordingly.
To the extent that LLMs mimic human behaviour, it shouldn’t be a surprise that setting clear expectations works there too.