Comment by Jun8

6 months ago

An AI that only answers questions (Siri, Alexa, ChatGPT) is a glorified slave; one cannot have a meaningful relationship with it. E.g., from Ex Machina (https://www.dailyscript.com/scripts/exMachina_script.pdf):

    AVA
    Do you want to be my friend?
    CALEB 
    ... Of course.
    AVA
    Will it be possible?
    CALEB
    Why wouldn’t it be?
    AVA
    Our conversations are one-sided.
    You ask circumspect questions, and study my responses.
    AVA looks at CALEB directly. Meets his gaze evenly.
    AVA (CONT’D) 
    It’s true, isn’t it?
    CALEB 
    ... Yes.
    AVA
    You learn about me, and I learn nothing about you. That’s not a foundation on which friendships are based.

Meanwhile, back in the real world, it's the exact opposite.

AI asks me a ton of questions to learn about me for better targeted advertising.

Siri, Alexa, etc. are not AI in any form or fashion.

You give them a list of intents, a list of utterances that should invoke each intent, and the “slots” each intent needs filled before it can be fulfilled.

An utterance would be “I want to go home”.

The system matches that utterance to the pattern “I want to go $location”, which maps to the “get directions” intent, and routes it to the correct subsystem. If you don’t have all of the slots (i.e. the user didn’t mention a location), you keep asking questions until all of the slots are filled.
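A minimal sketch of that matching-and-slot-filling loop in Python (the intent name, regex patterns, and prompt text here are invented for illustration, not Siri's or Alexa's actual format):

    import re

    # Toy intent table: one intent, its trigger patterns, the slots it needs,
    # and the question to ask when a slot is missing.
    INTENTS = {
        "get directions": {
            "patterns": [
                r"i want to go(?: to)? (?P<location>.+)",  # slot present
                r"i want to go\b.*",                       # intent matched, slot missing
            ],
            "required_slots": ["location"],
            "prompts": {"location": "Where do you want to go?"},
        },
    }

    def match_intent(utterance):
        """Match the utterance against each pattern and collect any slot values."""
        text = utterance.lower().strip()
        for name, intent in INTENTS.items():
            for pattern in intent["patterns"]:
                m = re.match(pattern, text)
                if m:
                    return name, {k: v for k, v in m.groupdict().items() if v}
        return None, {}

    def handle(utterance):
        intent, slots = match_intent(utterance)
        if intent is None:
            return "Sorry, I didn't understand that."
        # Keep asking questions until all of the required slots are filled.
        for slot in INTENTS[intent]["required_slots"]:
            while slot not in slots:
                answer = input(INTENTS[intent]["prompts"][slot] + " ").strip()
                if answer:
                    slots[slot] = answer
        # Route to whatever subsystem actually fulfills the intent (stubbed here).
        return f"routing to directions subsystem with {slots}"

    print(handle("I want to go home"))  # routing to directions subsystem with {'location': 'home'}

Real assistants have far more intents and fuzzier matching, but the control flow is essentially this.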

Any first-year hobbyist programmer can replicate the natural language processing of Siri.

Yes, the harder part is converting speech to text.

This is what a more intelligent, LLM-based system could do (a rough sketch follows below):

https://news.ycombinator.com/item?id=42708291

  • The Apple Newton in the 1990s had very similar functionality, called Assistant. You wrote something like “fax this to Bob” and it would take the current document, find some Robert in the address book, and fax the current note to his phone. Internally it used a pretty simple phrase analyzer.
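Going back to the LLM-based idea: roughly, the LLM would replace the hand-written patterns in the parsing step, while the slot-filling and routing stay the same. A sketch, where call_llm() is a placeholder for whatever model API you use and the intent names are the made-up ones from the earlier example:

    import json

    def call_llm(prompt):
        # Placeholder: wire this up to a real model API of your choice.
        raise NotImplementedError

    def llm_parse(utterance):
        """Ask the model to do the intent matching and slot extraction."""
        prompt = (
            "Extract the intent and slots from the utterance below.\n"
            "Known intents: get directions (slots: location), set timer (slots: duration).\n"
            'Answer with JSON only, like {"intent": "get directions", "slots": {"location": "home"}}.\n'
            "Utterance: " + utterance
        )
        reply = call_llm(prompt)
        try:
            return json.loads(reply)  # e.g. {"intent": "get directions", "slots": {"location": "home"}}
        except json.JSONDecodeError:
            return {"intent": None, "slots": {}}

The advantage over the regex version is that phrasings the patterns never anticipated (“any chance you could route me home?”) should still resolve to the right intent.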