Slacker News

Comment by cbolton  10 months ago

You can bypass the system prompt by using the API? I thought part of the "safety" of LLMs was implemented with the system prompt. Does that mean it's easier to get unsafe answers by using the API instead of the GUI?

2 comments


minimaxir  10 months ago

Safety comes from both the system prompt and RLHF post-training that teaches the model to refuse adversarial inputs.
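To illustrate the distinction: in a chat GUI the provider injects its own system prompt, while over the API the caller supplies (or omits) the system message, so only the model-level training applies unconditionally. A minimal sketch, assuming an OpenAI-style chat-completions payload shape; the field names follow that convention, the model name is a placeholder, and no request is actually sent:

```python
def build_chat_request(user_text, system_prompt=None):
    """Build a chat-completions style payload (dict only, nothing is sent)."""
    messages = []
    if system_prompt is not None:
        # The system-prompt layer of safety exists only if this message is included.
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_text})
    return {"model": "example-model", "messages": messages}

# GUI case: the provider's own system prompt is always present.
gui_request = build_chat_request(
    "some question",
    system_prompt="You are a helpful assistant. Refuse unsafe requests.",
)

# API case: the caller controls the system message and may omit it entirely.
api_request = build_chat_request("some question")

assert any(m["role"] == "system" for m in gui_request["messages"])
assert not any(m["role"] == "system" for m in api_request["messages"])
```

Either way, the RLHF-trained refusal behavior is baked into the model weights, so it applies to both requests; only the prompt-level layer differs.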

pegasus  10 months ago

Yes, it is.
