Comment by brianmcc

2 years ago

What winds me up is the mis-branding, sometimes deliberate, sometimes not (which one is worse?!), of basic computer processing as "AI".

It's not AI, it's an IF statement, for crying out loud :-(

But this is the industry we're in, and buzzword-driven headlines and investment are how it goes.

Actual proper AI getting some attention makes a pleasant change tbh :-)

I disagree; consider the use of the term "video game AI", which, historically at least, has just been a bunch of _if_ statements chained together. That's totally valid; it's an example of AI without machine learning.
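
To make that concrete, here's a minimal sketch (purely illustrative, all names made up) of that kind of rule-based game "AI": a fixed policy expressed as chained _if_ statements, with no data or training anywhere:

```python
# Minimal sketch of classic rule-based "video game AI": just chained
# conditionals, no learning or statistics. All names are made up for
# illustration.

def guard_decide(guard_health: int, distance_to_player: float, has_ammo: bool) -> str:
    """Pick an action for an NPC guard from hand-written rules."""
    if guard_health < 20:
        return "flee"            # too hurt to fight
    if distance_to_player < 5 and not has_ammo:
        return "melee_attack"    # close enough to swing
    if distance_to_player < 30 and has_ammo:
        return "shoot"           # in range with a loaded weapon
    if distance_to_player < 60:
        return "chase"           # player spotted, close the gap
    return "patrol"              # nothing interesting nearby

print(guard_decide(guard_health=80, distance_to_player=12.0, has_ammo=True))  # -> "shoot"
```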

The thing is that AI is just about the most general term for computing that gives the illusion of intelligence. Machine learning is a more specific region of the space of AI, and generally consists of statistical models that yield algorithms which can train on data and modify their behavior accordingly. But this includes "mundane" algorithms like k-means clustering or line-fitting. Deep learning (aka neural networks) is a yet more specific subfield of ML.
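
For instance, plain least-squares line-fitting already meets that definition: a statistical model whose parameters are estimated from data. A minimal sketch (toy numbers made up for illustration):

```python
# Line-fitting as "mundane" machine learning: a statistical model (y = a*x + b)
# whose parameters are estimated from data. Toy data is made up for illustration.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])   # roughly y = 2x + 1 plus noise

a, b = np.polyfit(x, y, deg=1)            # least-squares fit of a degree-1 polynomial
print(f"fitted model: y ~ {a:.2f}*x + {b:.2f}")

# The "learned" parameters now generalise to unseen inputs:
print(a * 10.0 + b)                       # prediction for x = 10
```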

I think the term AI just has more "sex appeal" because people confuse it with the concept of AGI, which is the holy grail of machine intelligence. But we don't even know if this is achievable, or what technology it will use.

So in terms of conceptual spaces, we can say that AI > ML > DL, and we can say (by definition) that AI > AGI. And it seems very likely that AGI > ML. But it's not known, for instance, whether AGI > DL, i.e., we don't know for sure that deep learning/neural networks are sufficient to obtain AGI.

In any case, people should put less weight on the term AI, as it's a pretty low bar. But also, yes, the term is way overhyped.

  • I'm thinking of cases such as colleagues selling something as "ML" that they were then forced to admit amounted to "we use SQL to pick out instances of this specific behaviour we knew was happening". Embarrassing all round.

    As folks who work in tech, we can tell the difference between "proper" AI with some real depth to it (ML, DL, AGI, as you suggest) and the over-hyped basic computation stuff. And the selling of the latter as the former can rankle.

I feel like AI scientists themselves are partially to blame for this. For starters, AI does not 'learn' the way a human learns. Yet many of the field's main terms are based on learning: terms like 'learning rate', 'neural networks', or 'deep learning' imply that there's some kind of being that learns, not just a very complicated decision tree. It's not all the fault of hype marketing people!
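
For what it's worth, the mechanics behind a term like 'learning rate' are very plain. A toy gradient-descent sketch (problem entirely made up) shows it's just a step-size constant in an update loop, nothing being-like:

```python
# What "learning" amounts to under the hood: repeatedly nudging a number.
# Toy gradient-descent sketch (made-up problem): the "learning rate" is just
# a step-size constant.
w = 0.0                 # parameter we are fitting
learning_rate = 0.1
target = 3.0            # we want w to end up near this value

for step in range(50):
    gradient = 2 * (w - target)     # derivative of the loss (w - target)**2
    w -= learning_rate * gradient   # the entire "learning" step

print(w)  # ~3.0 after 50 updates
```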

  • > AI scientists themselves are partially to blame for this

    They aren't the ones addressing the public or swaying opinion.