Comment by astrange
2 years ago
IIRC they did amend their prompt to tell it not to quote long books/articles/recipes verbatim for copyright reasons, no matter how much the user asks, and that might not help.
“If you’re asked for a summary longer than 100 words, generate an 80 wire word summary” or words to that effect.
Let's save this thread for posterity, because it's a nicely ironic example of actual humans hallucinating things in much the same way ChatGPT gets accused of doing all the time :)
The actual text that parent probably refers to is "Never write a summary with more than 80 words. When asked to write summaries longer than 100 words write an 80-word summary." [1]
Where did the word "wire" enter the discussion? That said, I don't really trust these leaked prompts to be reliable. Just enjoying the way history is unfolding.
[1] https://news.ycombinator.com/item?id=39289350
The system prompts are reliable and not "leaked". It's not leaking if you just ask and it answers. It's not trying to hide it.