Comment by Kiro
1 year ago
That's something completely different than what the OP suggests and would be a scandal if true (i.e. gpt-3.5-turbo-instruct actually using something else behind the scenes).
Ironically it's probably a lot closer to what a super-human AGI would look like in practice, compared to just an LLM alone.
Right. To me, this is the "agency" thing that I feel is still somewhat missing in contemporary AI, despite all the focus on "agents".
If I tell an "agent", whether human or artificial, to win at chess, it is a good decision for that agent to decide to delegate that task to a system that is good at chess. This would be obvious to a human agent, so presumably it should be obvious to an AI as well.
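The delegation idea above can be sketched as a trivial router. Everything here is hypothetical — the handler names and the keyword check are assumptions for illustration, not anything any vendor has documented:

```python
def handle_chess(request: str) -> str:
    # Hypothetical: forward to a dedicated chess engine (e.g. over UCI).
    return "delegated to chess engine"

def handle_general(request: str) -> str:
    # Hypothetical: fall back to the plain LLM.
    return "answered by LLM"

def dispatch(request: str) -> str:
    # A human agent would route a chess task to a chess specialist;
    # an AI agent could do the same with even a crude router like this.
    if "chess" in request.lower():
        return handle_chess(request)
    return handle_general(request)
```

The interesting (and hard) part is of course doing this routing dynamically and generally, not with a hardcoded keyword match.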
This may not be interesting to AI researchers, I suppose, but it makes for a more useful tool.
(This may all be a good thing, as giving AIs true agency seems scary.)
If this was part of the offering: “we can recognise requests and delegate them to appropriate systems,” I’d understand and be somewhat impressed but the marketing hype is missing this out.
Most likely because they want people to think the system is better than it is for hype purposes.
I should temper how impressed I am: it's only notable if it does this dynamically. Hardcoding recognition of chess moves isn't exactly a difficult trick, given there are only about three standard formats…
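To make the point concrete, here's a minimal sketch of how little it takes to spot the two most common notations (SAN like "Nf3" or "O-O", and long/UCI coordinate form like "e2e4"). The function name and the exact patterns are my own illustration, not anyone's actual implementation:

```python
import re

# Standard Algebraic Notation: castling, or optional piece letter,
# optional disambiguation file/rank, optional capture, destination
# square, optional promotion, optional check/mate suffix.
SAN = re.compile(r"^(O-O(-O)?|[KQRBN]?[a-h]?[1-8]?x?[a-h][1-8](=[QRBN])?)[+#]?$")

# UCI / long coordinate form: source square, destination square,
# optional promotion piece.
UCI = re.compile(r"^[a-h][1-8][a-h][1-8][qrbn]?$")

def looks_like_chess_move(token: str) -> bool:
    # Crude recognizer: a couple of regexes, no ML required.
    return bool(SAN.match(token) or UCI.match(token))
```

If hardcoded recognition like this is what's happening behind the scenes, it's a routing trick, not an emergent capability.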
3 replies →
So… we’re at expert systems again?
That’s how the AI winter started last time.
What is an "expert system" to you? In AI, they're just a series of if-then statements encoding certain rules. What non-trivial part of an LLM reaching out to a chess AI does that describe?
1 reply →
The point of creating a service like this is for it to be useful, and if recognizing and handing off tasks to specialized agents isn't useful, I don't know what is.
If I were sold a product that can generically solve problems, I'd feel a bit ripped off if I were told after purchase that I need to build my own problem solver and a way to recognise when to use it…
But it already hands off plenty of stuff to things like Python. How would this be any different?
1 reply →
If they came out and said it, I don't see the problem. LLMs aren't the solution for a wide range of problems. They're a new tool, but not everything is a nail.
I mean, it already hands off a wide range of tasks to Python… this would be no different.