Comment by PaulHoule

3 days ago

But where is the value?

If it could write like George Will or Thomas Sowell or Fred Hayek or even William Loeb that would be one thing. But it hears dog whistles and barks which makes it a dog. Except a real dog is soft and has a warm breath, knows your scent, is genuinely happy when you come home and will take a chomp out of the leg of anyone who invades your home at night.

We are also getting this kind of discussion

https://news.ycombinator.com/item?id=44502981

where Grok exhibited the kind of behavior that puts "degenerate" in "degenerate behavior". Why do people expect anything more? Ten years ago you could be a conservative with a conscience; now if you are one, you start The Bulwark.

> If it could write like George Will or Thomas Sowell or Fred Hayek or even William Loeb

Having only barely heard of these authors, even collectively, I'd bet most models could mimic their style better than I could. Perhaps not well enough to be of interest to you, and I will absolutely agree that LLMs are "low intelligence" in the sense that they need far more examples than any organic life does, but many of them will have had those examples and I definitely have not.

> We are also getting this kind of discussion

> https://news.ycombinator.com/item?id=44502981

Even just a few years ago, people were acting as if a "smart" AI automatically meant a "moral AI".

Unfortunately, these things can be both capable* and unpleasant.

* which doesn't require them to be "properly intelligent"

  • The bar is "Can it write as well as these accomplished professional writers?", not "Can it imitate their style better than the average person?"

    • Why is the bar set that high?

      Writers anyone has heard of are among the top ~1k-10k humans who have ever lived when it comes to "competent writing" — not just out of the 8 billion alive today, but out of the far larger number of everyone who lived between the invention of writing and now.