Comment by willio58

3 years ago

> the problem is that it has already been proven that it is possible to write good sounding nonsense and have that published as scientific research.

This seems to be an issue with how scientific publishing works, how things are 'peer reviewed', etc. If papers were _truly_ peer reviewed, wouldn't the reviewers catch the nonsense? And if a genuine review didn't catch it, then maybe it's not nonsense after all.

> an AI generated text can't possibly be useful because it is inherently untrustworthy

Probably 25% of the code I 'write' these days has been written by AI, through Copilot. Copilot isn't perfect, but it can create a basis to start from that saves me a lot of time. That's how I view these content-writing tools too: something to get the ball rolling, not something you'd use to generate all the content on your site without editing.