
Comment by IncreasePosts

1 year ago

Why doesn't the model prompt engineer itself?

Because that is a hard problem in itself: you would need a prompt (or set of prompts) that can reliably generate good chain-of-thought prompts for the whole range of problems the model encounters.
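To make that concrete, here is a minimal sketch of what "the model prompt-engineering itself" could look like: a two-step meta-prompt where the model first writes a chain-of-thought prompt for a task, then answers using it. The `complete` callable is a stand-in for whatever LLM API you happen to use, and the prompt wording is purely illustrative, not a recipe known to work well in general.

```python
from typing import Callable


def self_prompted_answer(task: str, complete: Callable[[str], str]) -> str:
    """Two-step meta-prompting sketch: the model writes its own
    chain-of-thought prompt for `task`, then answers using it."""
    # Step 1: ask the model to design a CoT prompt for this specific task.
    meta_prompt = (
        "Write a prompt that would help a language model solve the task "
        "below by reasoning step by step. Return only the prompt.\n\n"
        f"Task: {task}"
    )
    generated_prompt = complete(meta_prompt)

    # Step 2: run the generated prompt against the actual task.
    return complete(f"{generated_prompt}\n\nTask: {task}")


# Usage, assuming `complete` wraps your LLM of choice:
# answer = self_prompted_answer("How many weekdays are in March 2024?", complete)
```

Even in this toy form you can see the difficulty: the quality of the final answer now depends on the model writing a good prompt for an arbitrary task, which is exactly the part that's hard to get right generically.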

And CoT isn't always the best approach anyway; depending on the problem, other prompt-engineering techniques can perform better.