
Comment by Rebuff5007

7 months ago

I know this is the leading narrative, but I actually disagree.

Apple has a wonderful set of products without any form of generative AI, and those products continue to exist. Yes, there is an opportunity to add some fancy natural-language-based search and control, but that seems like relatively low-hanging fruit compared to the moat they have and defend.

Will adding generative AI natively to Apple products make people non-trivially change the way they use iPhones or Macs? Probably not. So there is literally no rush here.

It's not about being fancy. My examples are so utterly dull.

Being able to say "turn on the lights in the living room and turn off the fans in the kids' rooms" – is not a crazy use case.

Instead, I literally have to say:

- Turn on Living Room light

- wait

- turn off <son's name> bedroom fan

- wait

- turn off <daughter's name> bedroom fan

Yes, I could actually say "turn off all the fans" (I believe Siri understands that) but that's usually not what I want.

Another example: you have 3-4 timers going. Let's say I'm cooking and have an oven timer, but also a timer for my kids to stop their device time; I may have a few going. Being able to say "cancel all the timers except the longest one" is TRIVIAL for a first-year programmer to implement. But instead, it's a slog with Siri.
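It really is first-year-programmer territory. A minimal sketch in Python (the timer labels and durations below are made up for illustration; this is the logic, not Siri's actual implementation):

```python
def cancel_all_but_longest(timers):
    """Given a dict of timer name -> seconds remaining, return the
    names of the timers to cancel: everything except the one with
    the most time left."""
    if not timers:
        return []
    longest = max(timers, key=timers.get)
    return [name for name in timers if name != longest]

# Hypothetical timers: oven plus two device-time timers.
timers = {"oven": 420, "son's devices": 1800, "daughter's devices": 900}
print(cancel_all_but_longest(timers))  # ["oven", "daughter's devices"]
```

One `max` and one list comprehension; the hard part is the voice plumbing, not the logic.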

  • Actually, what you describe should be feasible with the new on-device foundation models (I haven’t installed the beta myself, but in my close friend group we’ve been suggesting prompts to the couple of brave people who do Swift development, and the foundation models seem able to do that).

    • That's wonderful news. I hope it translates to being baked into the built-in apps like Home/Siri.

  • You could make this slightly easier on your sanity by adding your kids’ rooms to a zone, fyi.

    • I'll try that, thanks. (Probably a zone like "upstairs" to turn off stuff upstairs when we're downstairs all day)

There is a consequence to shifting to LLMs. Despite Siri's reputation, it is a well-used product(1), and despite HN's constant noise, Siri actually works very well for controlling other Apple devices, in ways I've noticed to be far better than Alexa (the other digital assistant I regularly use).

Switching that to an LLM-based approach represents a massive increase in computational requirements without moving the needle for most requests. However fancy it sounds, users don't want to sit and have a verbose, ChatGPT-style conversation with Siri; they just want the command run and done. So any advertised change to Siri will need to be sufficiently large that Siri can seemingly decode any request with minimal or no follow-up questioning; anything short of this will be largely derided and face the same backlash as current-era Siri.

At the moment Siri answers many trivial requests without the use of an LLM. You can speak to Siri with relative terms or needs-based requests: e.g. saying "It's dark in here" will result in Siri turning on the lights in just the room where the request was made(2), even if other receivers in the house heard the request. It's also smart enough to recognise that if you labelled a room as something like the "office" but later made a request for the "study", it will ask if you actually meant the "office".
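That "study"/"office" prompt reads like synonym resolution over the configured room names. A toy sketch of the idea in Python (the synonym table and room names are invented; a real assistant presumably uses a learned model rather than a lookup, but the confirm-or-execute behaviour is the same):

```python
# Hypothetical synonym table capturing common alternate room names.
ROOM_SYNONYMS = {"study": "office", "lounge": "living room", "den": "office"}

def resolve_room(requested, rooms):
    """Map a spoken room name onto a configured room.
    Returns (room, needs_confirmation)."""
    if requested in rooms:
        return requested, False   # exact match: just run the command
    candidate = ROOM_SYNONYMS.get(requested)
    if candidate in rooms:
        return candidate, True    # prompt: "Did you mean the office?"
    return None, False            # no idea which room was meant

print(resolve_room("study", ["office", "kitchen", "living room"]))
```

The `needs_confirmation` flag is what drives the "did you actually mean..." follow-up instead of silently guessing.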

The big caveat here is that Siri's abilities change based on the language selected; non-English languages appear to have less flexibility in the type of request and the syntax used. Another factor is that requests during certain peak periods appear to be handled differently, as if there are fall-back levels of AI smarts at the server level. To get around that, the new Siri will need to be largely offline, which appears consistent with Apple's new AI strategy of local models for basic tasks, with complex requests sent to Private Cloud Compute.

Like Apple Maps, I anticipate the pile-on to Siri will go on far longer than deserved, but what does seem certain is that change is coming.

(1) Apple has stated that Siri is the most-used digital assistant. However, I have not found any data supporting this claim other than Apple's own keynote address where it was made.

(2) Requires that rooms are set up in homekit and there are per-room based receivers, such as a homepod in each room.

  • >> While fancy, users don't want to sit and have a verbose ChatGPT style conversation with Siri, they just want the command run and done.

    You're absolutely right! Unfortunately Siri is too dumb for even the most basic tasks. Their ML approach failed and they can't admit it.

This approach would be fine if users were empowered to add AI integration they wanted on their own.

They are not, though. Absolute control over the platform means Apple has the responsibility to have more vision for the future than anyone else. They do not, and will fail to satisfy their users. It will result in either a dramatic change of leadership and strategy, and/or drive customers elsewhere.

> Siri, open the east door

< Do you want to open: the east door, the west door, or all the doors?

> Siri, open the east door

< Opening the east door

They kinda really super suck. Siri used to work better than it does today. It's often a bigger chore than opening the app and tapping the button.

These quirks hit me on a daily basis, when all I want to do is control my lights and locks.

  • Turn off Apple Intelligence. I got sick of Siri asking, when I asked for the garage door to be opened, whether I meant my house in Wyoming or my 6th-story apartment in New York (which doesn't have a garage).

    • I think Apple Intelligence is not available on either of my two phones or my one iPad, but still, a third of the time when I say "Siri, stop", the HomePod in a different room says "Stop the alarm on Ben's iPhone 2022?", while the actual device 2m away that I don't want to touch (because I'm cooking) carries on ringing its alarm or timer.

      Or, even weirder: the phone itself does stop, the HomePod says that anyway, and then follows up with "Hello?" because I didn't answer, because the phone alarm had stopped.

Obviously reasonable minds may disagree, and I do disagree with your disagreement. Your response necessarily stems from a premise that LLMs are just stochastic parrots, incapable of non-trivially changing someone's usage. That isn't true, and has been shown to be untrue in many domains at this point. And that's only the chatbot form of LLMs; tool usage and agents will reveal more new paradigms.

If OpenAI released a phone, Apple’s sales would be down 50%.

At this point only a handful of irreplaceable apps are propping iOS up, and that won’t last.

  • I highly doubt that OpenAI is capable of releasing a full phone that isn't just a reskin of a generic Android with "AI". iOS design sucks (imo) and its app selection is smaller than Android's, but that's not what makes people buy iPhones; it's simple familiarity and marketing. I'll definitely be switching off my iPhone when it breaks, but that'll probably take at least a decade. Phones are pretty much feature-complete at this point; for a normal person there's almost no reason to upgrade.

  • What data backs this take of yours?

    What irreplaceable apps are propping up iOS? What's the data showing that 50% of iPhone users are basically just begging to get off the platform?

    • > What irreplaceable apps are propping up iOS?

      EV charging and smart ID (aka gov ID). To a lesser extent, anything that's smart hardware: home automation, cameras, vacuums, cars, drones, etc. Then there are services, namely ride hailing and e-scooter hire. Plenty of mobile web apps are pretty crippled too, so you download the apps.
