
Comment by staticman2

5 months ago

Imagine the supposedly super-intelligent "chain of thought" is sometimes just RAG.

You ask for a program that does XYZ, and the RAG engine tells the model, "Here is a similar solution; please adapt it to the user's use case."

The supposedly smart chain-of-thought prompt hands you your solution, but it's actually doing a simpler task than it appears to be: adapting an existing solution instead of writing a new one from scratch.
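A minimal sketch of what that hypothetical setup could look like, with `find_similar_solution` and `call_model` as made-up placeholders rather than any real API:

```python
def find_similar_solution(request: str) -> str:
    """Hypothetical retrieval step: return a stored program resembling the request."""
    return "def xyz():\n    ...  # previously written solution pulled from some corpus\n"


def call_model(prompt: str) -> str:
    """Placeholder for whatever completion call the provider actually makes."""
    return f"[model output for a prompt of {len(prompt)} characters]"


def answer_with_rag(user_request: str) -> str:
    retrieved = find_similar_solution(user_request)
    # The hidden "chain of thought" might be little more than this prompt:
    prompt = (
        "Here is a similar solution; please adapt it to the user's use case.\n\n"
        f"Existing solution:\n{retrieved}\n"
        f"User request: {user_request}\n"
    )
    return call_model(prompt)


if __name__ == "__main__":
    print(answer_with_rag("Write a program that does XYZ"))
```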

Now imagine the supposedly smart solution is doing RAG over code they don't even have a license to use.

Either scenario would give them a good reason to try to keep it secret.