Comment by kjkjadksj, 6 days ago:
I’d rather read broken English than LLM output.

3 comments

dang, 6 days ago:
Yes, that's what we tell people too.
https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...

nobody_r_knows, 3 days ago:
You can't even tell, bro. This comment was generated by AI. Can you tell? Hardly. You're like a bouncer telling people they can't wear warm underwear. How on earth will you possibly know, care, or tell?

seanw444, 6 days ago:
Ditto