Comment by recursive

13 hours ago

This only works as "proof" until someone introduces an "authenticity" flag on the LLM output.

Honestly, you can basically do this now, no flag needed.

If you want it to sound more real, you just have to tell the bot to write that way: literally ask it to throw in some typos or forget to capitalize things, or to use slang and ramble a bit instead of being all robotic and organized.
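As a minimal sketch of what that instruction might look like in practice, here is a helper that assembles such a "write casually" system prompt. The function name and the specific instructions are illustrative assumptions, not any particular product's API; you would pass the resulting string as the system message to whatever chat model you use.

```python
# Hypothetical sketch: build a system prompt that tells a chat model to
# drop its polished register. Everything here is illustrative.

def humanizing_prompt() -> str:
    """Return a system prompt asking the model to write informally."""
    instructions = [
        "write mostly in lowercase",
        "leave in an occasional typo",
        "use slang and filler words",
        "ramble a little instead of producing tidy lists",
    ]
    return (
        "Answer like a casual forum commenter: "
        + "; ".join(instructions)
        + "."
    )

print(humanizing_prompt())
```

The point is that none of this requires model-side support; it is just an ordinary instruction in the prompt, which is why a hypothetical "authenticity" flag would be easy to route around.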