Comment by mdp2021

2 years ago

> “Art” isn’t a single thing. It’s not just pretty pictures

And this is why it was capitalized as "Art", proper Art.

> AI can’t make art

Not really: rather, "we may not yet have AI that makes art". But if the process that creates, that generates (in the proper sense) art were fully replicated, anything that can run that process could make Art.

> And give a good solid definition for [T]hought

The production of ideas which are truthful and important.

> which doesn’t depend on lived experiences while we’re at it. You can’t

Yes, we can abstract from instances to patterns and rules. But that matters only relatively: if the idea is clear (and ideas can be very clear to us), we do not need to describe it in detail; we just look at it.

> AGI” as well

A process of refinement of the ideas composing a world model according to truthfulness and completeness.

> proper Art.

That’s not a real thing. There’s no single definition for what art is as it’s a social construct. It depends on culture.

> anything that can run that process can make Art

Again, without a definition of art this makes no sense. Slime mold can run processes, but it doesn't make art, because art is a human cultural phenomenon.

> the production of ideas that are truthful and important

What do "ideas" and "important" mean?

To an LLM, there are no ideas. We humans are personifying them and creating our own ideas. What is “important,” again, is a cultural thing.

If we can't define it, we can't train a model to understand it.

> yes we can abstract from instances to patterns and rules.

What? Abstraction is not defining.

> we do not need to describe them in detail

"We" humans can, yes. But machines cannot, because thought, again, is a human phenomenon.

> world model

Again, what does this mean? Magic perfect future prediction algorithm?

We’ve had soothsayers for thousands of years /s

It seems to me that you've got it in your head that, since we can make a computer generate understandable text using statistics, machines are now capable of understanding deeply human phenomena.

I'm sorry to break it to you, but we're not there yet. Maybe one day, but not now (and I don't think ever, as long as we're relying on statistics).

It’s hard enough for us to describe deeply human phenomena through language to other humans.

  • > It seems to me that you’ve got it in your head

    Do us all a favour and never again keep assumptions in your head: your misunderstanding was off the scale. Do not guess.

    Back to the discussion from its origin: a poster defended the idea that the purpose of AI would be to enable leisure and possibly sport (by alleviating menial tasks), not to produce cultural output. The reply was, first, that since cultural output has value, it is welcome from all sources (provided the Value is real), and second, that our needs go beyond menial tasks, given that we have a large deficit of proper thought and proper judgement.

    The literal sentence was «yes, if algorithms strict or loose could one day produce Art, and Thought, and Judgement, of Superior quality: very welcome», which refers to the future, so it cannot be interpreted as claiming the possibility is available now.

    You have brought LLMs into the topic when LLMs are irrelevant (and you have stated that you «personify[] them»!). LLMs have nothing to do with this branch of the discussion.

    You regard the things said as «vague», and miss definitions for things: but we have very clear ideas instead. We simply do not include the full textual expansion of all those ideas in our posts.

    Now: you have a world in front of you; of that world you create a mental model; the mental model can have a formal representation; details of that model can be insightful for the truthful advancement of the model itself: that is Art, or Thought, or Judgement, according to the different qualities of said detail; and the truthful advancement of the model has Value and is Important: it does, if only given the cost of the consequences of acting under inaccurate models.