Comment by fullstackchris

2 years ago

Ugh, I have to finish my article about these "system prompts"... basically, if one does any amount of second-order thinking, one realizes that it's impossible to prove whether these are real or not, much like the simulation theory...

Do people realize that reproducibility is in no way a signal of factuality? Especially when it's coming from these LLMs.

And have we had ONE, just ONE person from OpenAI / Mistral / Microsoft ever "leak" one of these so-called "system prompts"? No. It's mass delusion at its finest, but I suppose it is entertaining to watch the discussions that resurface every few months.