Comment by yifanl

6 months ago

We don't have a formal classification of which technologies can be considered "AI", but computer vision would feel like a valid entrant to me.

I thought AI meant "ML" + marketing.

I joke, but not. I'm a researcher, and AI has been a pretty ambiguous term for years, mostly because intelligence is still not well defined. Unfortunately I think it's become less well defined in the last few years (whereas before that it was getting better defined), via the (Fox) Mulder Effect.

Computer vision totally qualifies as AI as it can grant an agent artificially intelligent behavior.

  • The fuck it does.

    For it to be AI, it needs some sort of ML basis. Otherwise it's just fancy "classical" computer vision.

    (This is from someone who's been working in the field for far too long, and remembers a time before "deep", "ML", and "AI" were part of every paper.)

  • Based on what is said in the article, it seems like a VERY simple algorithm. It clusters the pixels in the image by color and reports any small blobs of unusual color. That's not AI by any of the stupid definitions we've come up with recently.
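The procedure described there — cluster pixels by color, report small blobs of unusual color — can be sketched in a few lines. This is a hypothetical reconstruction based only on that description; the function name, thresholds, and use of NumPy/SciPy are my assumptions, not the actual drone code:

```python
# Hypothetical sketch of an "unusual color blob" detector: no learning,
# no training -- just arithmetic with hand-picked thresholds.
import numpy as np
from scipy import ndimage

def find_unusual_blobs(img, dist_thresh=60.0, max_blob_px=40):
    """img: HxWx3 uint8 array. Returns centroids of small off-color blobs."""
    px = img.reshape(-1, 3).astype(float)
    mean_color = px.mean(axis=0)                # dominant scene color
    dist = np.linalg.norm(px - mean_color, axis=1).reshape(img.shape[:2])
    mask = dist > dist_thresh                   # chromatically unusual pixels
    labels, n = ndimage.label(mask)             # group into connected blobs
    blobs = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if len(xs) <= max_blob_px:              # only small blobs are reported
            blobs.append((xs.mean(), ys.mean()))
    return blobs

# Toy example: a gray field with one small orange patch.
img = np.full((32, 32, 3), 128, dtype=np.uint8)
img[10:13, 20:23] = (255, 120, 0)
print(find_unusual_blobs(img))
```

Every number in it (`dist_thresh`, `max_blob_px`) is chosen by a human, which is exactly the point being argued.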

Maybe? I am currently going through "Artificial Intelligence: A Modern Approach" by Russell & Norvig, and from a historical perspective alone, it seems vision would qualify.

It is just that the language drifted a little, the way it did with "cyber" meaning something else to post-90s kids. So now AI seems to be mostly associated with LLMs, but not that long ago, AI seemed to include almost any use of an algorithm.

I am not an expert in the field at all. I am just looking at stuff for personal growth.

https://en.wikipedia.org/wiki/AI_effect

  • I thought that AGI covered that. AGI to my mind doesn’t have to surpass human thinking. It just has to be categorically the same as it (it can be less powerful, or more). It has to be general. A chess machine in a box which can’t do anything else is not general.[1]

    I’ve always been fine with calling things AI even though they are all jumbles of stats nonsense that wouldn’t be able to put their own pants on. Does a submarine swim? No, but that’s just the metaphor that the most vocal adherents are wedded to (at the hips). The metaphor doesn’t harm me. And to argue against it is like Chomsky trying to tell programming language designers that programming languages being languages is just a metaphor.

    [1] EDIT: In other words it can be on the level of a crow. Or a dog. Just something general. Something that has some animalistic-like intelligence.

    • I think the point of the Wikipedia article is that human categories are flexible, and they get redefined to suit human ego needs regardless of what's happening in the objective outside world.

      Say that you have a closed system that largely operates without human intervention - for example, the current ad fraud mess where you have bots pretending to be humans that don't actually exist to inflate ad counts, all of which gets ranked higher by the ML ad models because it inflates their engagement numbers, but it's all to sell products that don't really work anyway, so that the company can post better revenue numbers to Wall Street and unload the shares on prop trading bots and index funds that are all investing algorithmically anyway.

      On some level, this is a form of "intelligence", even though it doesn't put pants on. For that matter, many human societies don't put pants on, nor do my not-quite-socialized preschool kids. It's only the weight of our collective upbringing, coupled with a desire to feel intelligent, that leads us to equate putting pants on with intelligence. Plenty of people don't put pants on and consider themselves intelligent as well. And the complexity of what computers actually do is often well beyond the complexity of what humans do.

      I often like to flip the concept of "artificial intelligence" on its head and instead think about "natural stupidity". Sure, the hot AI technologies of the moment are basically just massive matrix computations that statistically predict what's likely to come next given all the training data they've seen before. Humans are also basically just massive neural networks that respond to stimulus and reward given all the training data they've seen before. You can make very useful predictions about, say, what is going to get a human to click on a link or open their wallet using these AI technologies. And since we too are relatively predictable human machines that are focused on material wealth and having enough money to get others to satisfy our emotions, this is a very useful asset to have.


Computer vision is a field of AI. But this is just an algorithm without any sort of learning or training process.

  • They might have needed to learn what a good difference threshold and cluster size is. It's hardly ML the way fine-tuning CLIP embeddings is, but there are few solid differences: both explore visual embedding spaces with learned values. Granted, cluster thresholds are more likely to be learned manually, but both are embedding spaces, with the main difference being dimensionality.

    It's very vague for Wired to have used AI in the title, but it's more confusing to say "A previous headline on this piece incorrectly stated that the drone software used AI." - and not obviously correct either.

    • No, the problem is that the human programmers are the ones doing the learning. That's not artificial intelligence, that's just regular human learning.

      The algorithm is: identify pixels that are chromatically different from the surrounding pixels. And that's it. That's not AI, that's an algorithm. Any changes come from the human programmers manually changing the algorithm, not from any self-increased capabilities acquired through machine learning, etc.
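A minimal sketch of what "identify pixels that are chromatically different from the surrounding pixels" could look like — the function name, window size, and threshold are illustrative assumptions, not the actual software. The point stands either way: there are no learned weights anywhere, only hand-chosen constants:

```python
# Assumed sketch of a local chromatic-difference check: compare each
# pixel's color to the mean color of its surrounding window.
import numpy as np
from scipy import ndimage

def chromatic_outliers(img, win=9, thresh=50.0):
    """img: HxWx3 uint8. Returns a boolean mask of locally off-color pixels."""
    f = img.astype(float)
    # Per-channel local mean over a win x win neighborhood.
    local = np.stack(
        [ndimage.uniform_filter(f[..., c], size=win) for c in range(3)],
        axis=-1,
    )
    diff = np.linalg.norm(f - local, axis=-1)   # color distance to surroundings
    return diff > thresh                        # fixed threshold, nothing learned
```

Any improvement to this comes from a programmer editing `win` or `thresh` by hand, not from the system updating itself.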


  • ML ≠ AI.

    Machine learning was widely considered to be a subset of AI, until it got a big resurgence almost 2 decades ago. Now some people use the terms interchangeably.

No, even before the current AI era classical computer vision was not considered to be "AI"... because it isn't. That's just a fact.