Comment by tehjoker

3 months ago

You already can't trust LLM output. How will you trust it when it's actively steering you based on a profit motive?