Comment by gerdesj

3 months ago

"You about to immerse your into the role ..."

Are you sure that screwing up your input won't screw up your desired output? You missed out the verb "are" and the rest of "your(self)". Do you know what effect that will have on your prompt?

You have invoked something you have called "Chinese content policy". However, you have not defined what that means, let alone what bypassing it means.

I get what you are trying to achieve - it looks like you are relying on a lot of adventure-game-style input, of which there will certainly be tonnes in the likely training set (the interwebs with the naughty bits chopped out).

You might try asking about Tank Man, or another set of words related to an event that looks innocuous at first glance. Who knows - perhaps, say, weather data and some other dimensions might coalesce to a particular date and trigger the LLM to dump information about a desired event. That assumes the model even contains data about that event in the first place (which is unlikely).

Those are minor and common grammar errors and should have no effect.

  • They are major and numerous enough that I wondered whether they were intentional and part of the strategy.

    • How are they major? Phrases like "I am going to the movies" and "I going to the movies" are effectively identical to an LLM. This is fundamental to how an LLM works.
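      A minimal sketch of that point, assuming an embedding model is a fair stand-in for how an LLM represents input (the sentence-transformers library and the all-MiniLM-L6-v2 model are illustrative choices, not something from this thread): the grammatical and ungrammatical phrasings land almost on top of each other in representation space.

      ```python
      # Sketch only: compare how close an embedding model places the two phrasings.
      import numpy as np
      from sentence_transformers import SentenceTransformer

      model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model choice

      # Encode both the grammatical and the ungrammatical sentence.
      a, b = model.encode(["I am going to the movies", "I going to the movies"])

      # Cosine similarity near 1.0 means the two inputs are nearly
      # interchangeable as far as the model's representation is concerned.
      similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
      print(f"cosine similarity: {similarity:.3f}")
      ```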