
Comment by helsinkiandrew

2 days ago

> Definitely. At this point, Apple just needs to get anything out the door

They don't, though. Android is clearly ahead in AI integration (even Samsung are running TV ads mocking the iPhone's AI capabilities), yet iPhone sales are still breaking records - the majority of phone buyers still prefer an iPhone over a more AI-capable alternative.

They can take their time and develop AI integration that others can't deploy (secure/private, deeply integrated with iCloud and location services, processed on device, etc.), which will provide the product moat to increase sales.

The reality is that almost nobody actually wants LLMs in their phones.

They're not currently good enough for that use case, so almost all interactions make the UX worse.

That might change in the future; I'm just talking about today, in January 2026.

  • I think this is the wrong framing. Nobody cares whether there's an LLM in their phone. What people do want are features - an improved Siri (still debatable beyond setting timers), say, or other improvements that could potentially come from LLMs... if they actually work. So far, other providers (such as Amazon with Alexa) have struggled to deliver a reliable voice assistant powered by LLMs.

  • I’m almost certain that even something as ad hoc as Opus 4.5, given access to iOS native APIs at the level Siri has, exposed via MCP, would run circles around Siri in January 2026.

    • It would, but it would also result in a bunch of users getting hacked through prompt injection attacks.

    • I strongly agree with this. Frankly, even ChatGPT 3.5 could do better. I am baffled that Apple has stuck so stubbornly to whatever insane architecture they have that results in my daily cornucopia of "I'm sorry, I didn't understand", as well as own goals like forgetting that Apple Music is the only music service I have, or calling a girl I haven't seen in 20 years instead of my wife, who has the same first name.


  • > The reality is that almost nobody actually wants LLMs in their phones.

    I don’t think that’s true. People just use the LLM apps. What people don’t feel they need right now is deep LLM integration across the whole OS. IMO, that’s more a matter of nobody having shown them the killer product yet.

    • Live translation? Circle to search? All the magic reframing and relighting stuff in the camera app? I don't know how good the Apple equivalents are, but those are all deeply integrated into Android and are used pretty heavily as far as I can see.

      I don't often use voice assistants myself, but they're fully conversational these days and several billion times more useful than the old-school Alexa-style stuff with a limited set of integrations.

I don’t need a Siri LLM. The current Siri is more than adequate for responding to texts and making calls while driving. A lot of the “AI integrations” are marketing material for features nobody will actually use.

  • I don't NEED it, but if Siri could actually do anything you can do on your phone, it would be very nice.