Comment by techblueberry
25 days ago
I think what we have is mostly AGI. It’s artificial, it’s intelligence, and most importantly it’s general. It may never get an IQ above 75 or so, but it’s here.
Yeah, LLMs fulfill any goalpost I had in my mind years ago for what AGI would look like, like the starship voice AI in Star Trek, or merely a chat bot that could handle arbitrary input.
Crazy how fast people acclimate to sci-fi tech.
The Mass Effect universe distinguishes between AI, which is smart enough to be a person—like EDI or the geth—and VI (virtual intelligence), which is more or less a chatbot interface to some data system. So if you encounter a directory on the Citadel, say, and it projects a hologram of a human or asari that you can ask questions about where to go, that would be VI. You don't need to worry about its feelings, because while it understands you in natural language, it's not really sentient or thinking.
What we have today in the form of LLMs would be a VI under Mass Effect's rules, and not a very good one.
Note that Mass Effect's world purposely muddies the waters between the two and blurs the lines. "Is this a VI or a real AI?" is left an open question in some cases so that the player can explore the idea.
Halo also draws a distinction, between "Smart AI," which is what we would generally consider AGI or even super-AGI, and "Dumb AI," which is purposely limited. Our current LLMs are similar to "Dumb AI" in shape, but not even remotely close in capability.
In both universes, an "AI" or similar system will not hallucinate. If it tells you something wrong or inaccurate, it's usually because it has been tampered with or because it has "gone crazy," which is an identifiable state that is abnormal, not probabilistic.
Star Trek also makes distinctions. The ship's computer, for example, largely does not make deductions, and doesn't always operate in natural human language; instead it requires you to use specific phrasing and commands. The Star Trek ship's computer is basically what you'd get using 20-year-old text-to-speech to run Wikipedia and database queries, and that's mostly it. It cannot analyze data itself. Data and the fully conscious Sherlock Holmes are both capable of automatically forming and testing a hypothesis.
It's actually weird how many people don't seem to notice that. The ship's computer in Star Trek is purposely dumb and command-driven. It is not an agent, it does not think, and it does not understand natural human language. We had the Star Trek ship's computer decades ago.
Peter F. Hamilton's sci-fi novels do something similar: they differentiate between SI (Sentient Intelligence), which is essentially its own being and is not used by people, as that would amount to slavery, and RI (Restricted Intelligence), which is used for general-purpose "AI" with strict limits placed around it.
This is a great analogy.
The term AGI so obviously means something way smarter than what we have. We do have something impressive but it’s very limited.
> I think what we have is mostly AGI.
I agree that the term AGI is the problem. If I have something as intelligent as a mouse, that should be AGI; if I have something as intelligent as a bird, that should be AGI. Same if it's as intelligent as a 2-year-old human, or as someone with an IQ of 75. Those would all clearly be artificial, general intelligences.
But the problem is that the term AGI also oddly carries this bar that it must be equal to or better than a human (a standard that the majority of humans would fail based on the definition of intelligence alone), plus better than all humans across multiple disciplines (something it would take a super-genius for a human to accomplish).
Given the current definition, if you took a bright high schooler and made them artificial, they wouldn't count as AGI, which makes the definition silly.
And that is separate from the entire concept of sentience, which may or may not be a requirement for intelligence.
It's all a bunch of squishy definitions mashed together.