kjkjadksj 2 months ago
I'd rather read broken English than LLM output.

    dang 2 months ago
    Yes, that's what we tell people too.
    https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...

    nobody_r_knows 2 months ago
    You can't even tell, bro. This comment was generated by AI. Can you tell? Hardly. You're like a bouncer telling people they can't wear warm underwear. How on earth will you possibly know, care, or tell?

    seanw444 2 months ago
    Ditto