Comment by ozozozd
25 days ago
I realized that running one’s own writing through an LLM reduces the amount of information in it. Sort of like washing the nutrients out of a fruit.
When we write about something, things about us inevitably leak into our writing. How we think about the subject, our value judgments about it, how much thought we’ve given it, whether our perspective is stale or fresh: all of it comes through, even if we don’t intend it to. All of this information builds trust, and it helps the reader empathize and see our point of view.
When our writing passes through an LLM, most of this is simply lost. What comes out is an average expression of those thoughts with all the sharp edges, the character and essence, removed.
All writing is opinionated, and when it runs through an LLM, it comes out opinion-less. I’ve noticed that I don’t care for opinion-less writing. Or people.
One exception is the official Python documentation. I recently read some of the new documentation and realized that it reads almost exactly as it did when I first read it in 2010. I couldn’t believe it. Low in opinion, high in information density. I know for a fact that it has opinions in parts, but they are shockingly infrequent.