Comment by JohnFen
3 years ago
But words are meant to convey meaning to other people, so what the word means to others is more important than what it means to you.
This sort of problem is common with language, and is a great example of why I'm not really on board with using natural language for technical things.
>But words are meant to convey meaning to other people, so what the word means to others is more important than what it means to you.
I pretty much agree with that, so I'm not sure where the disagreement is here. Let me go back to the original statement I was responding to.
>Since nobody actually knows what "intelligence" is, the word will mean to people whatever they want it to mean.
If I tell you someone is intelligent, you roughly know what I am talking about. Just because it's hard to formalize that doesn't mean that the word can mean whatever people want it to mean. For example, if I tell you my friend is intelligent, you would be wrong to interpret that as meaning that my friend has red hair, because hair color is irrelevant to the traits that we normally associate with intelligence. The fact that there are right and wrong ways of interpreting my sentence implies that there is some generally agreed upon notion of what intelligence is, even if that notion is fuzzy and has grey areas.
> I'm not sure where the disagreement is here
I'm not sure we are disagreeing. I'm just having a discussion.
> If I tell you someone is intelligent, you roughly know what I am talking about.
Correct, because the context (you're talking about a human, and I know roughly what that means with humans) narrows the possibilities. But even there, it's a vague sort of intuitive knowledge, like trying to say what "art" is.
But when it comes to other areas -- such as machines -- context doesn't help narrow the possible meanings. What does saying a machine is "intelligent" mean? If you ask a machine learning person, you'll get a reasonably specific answer. If you ask the average person on the street, you'll get very, very different answers.
The reason is that we don't know what "intelligence" actually is. We don't even know, with any specificity, what it is in humans -- which is why psychologists assert that there are multiple kinds of intelligence (even if they disagree about how many there are).
> even if that notion is fuzzy and has grey areas.
I don't disagree at all. But the notion has more fuzzy and gray areas than solid ones. As an example, when most people imagine an "artificial intelligence", what they're really imagining is "consciousness". Is consciousness required for intelligence? Who knows? The answer to that depends on what you mean by "intelligence", and we don't agree enough on what that means to have that sort of discussion without beginning by defining the terms.
I don't think the difference between humans and machines matters here. We could ignore the "artificial" aspect and just focus on how we would decide whether some alien biological species is intelligent. I would say that the alien is intelligent if it displays the ability to learn, reason, form abstractions, and solve problems across a wide range of domains. I would apply the same criteria to a machine because I don't think the implementation details matter. It doesn't matter whether you are made of carbon or silicon, or whether you are running a neural network or propositional logic.