Comment by cush
3 days ago
> Gemini is objectively a good model (whether it's #1 or #5 in ranking aside) so Apple can confidently deliver a good (enough) product
Definitely. At this point, Apple just needs to get anything out the door. It was nearly two years ago that they sold a phone with features that still haven't shipped, along with the promise that Apple Intelligence would come in two months.
Yes but they also haven’t generated spicy deep fakes and talked kids into suicide with their products.
It’s just how Apple does things: they still have no folding phone, under-screen fingerprint scanner, under-screen front-cam, etc.
Apple is always behind on industry trends, but when they adopt them eventually, they become mainstream and cool. This is what will happen with the folding phones this year, if rumors are true.
Folding phones have been around for half a decade and sold tens of millions of units. Same with VR.
Apple is in the value extraction business these days: their devices are conduits for advertising Apple services. The Vision Pro flopped because they wanted to charge an arm and a leg for a platform that was actively hostile to developers. It's not 2008 anymore.
> but when they adopt them eventually, they become mainstream and cool
When was this part last true?
29 replies →
> Apple is always behind on industry trends
Huh, I always thought it was the other way around (whether people liked it or not): ditching floppy disks, ditching CD-ROMs, prioritizing Bluetooth over wired earphones, etc. I am glad, though, that they were forced to switch to USB-C, if I'm not mistaken.
6 replies →
They did still overpromise, and that should not be the way Apple does things (although it was hardly the first time; the AirPower mat was announced in 2017).
To be fair, all tech companies do this. Sell first, implement later, hype hype hype. Of course we’d like to think Apple was better, but well.. it isn’t.
2 replies →
Whataboutism doesn’t justify what Apple did. They took billions of dollars from consumers using demos of products those consumers never received.
> At at this point, Apple just needs to get anything out the door
To the extent Cupertino fucked up, it's in having had this attitude when they rolled out Apple Intelligence.
There isn't currently a forcing function. Apple owns the iPhone, and that makes it an emperor among kings. Its wealth is also built on starting with user problems and then working backwards to the technology, versus embracing whatever's hot and trying to shove it down our throats.
Lately, they've arguably been starting from their own priorities (i.e. pushing and protecting their "services" revenue at all cost) and working backwards to an acceptable user experience from there.
> Its wealth is also built on starting with user problems and then working backwards to the technology, versus embracing whatever's hot and trying to shove it down our throats.
Then again, remember millimeter wave? But yes, as a general rule I think your point still stands.
> Its wealth is also built on starting with user problems and then working backwards to the technology
Since when?
> versus embracing whatever's hot and trying to shove it down our throats
I agree here, to a degree. It's just that Apple tells its customers what's hot and then shoves it down their throats.
> Apple tells its customers what's hot and then shoves it down their throats
I don't really understand this. Is it shoving when something is actually popular? The iPod was legitimately extremely popular. Did Apple decide it was hot and then somehow force people to buy 450 million of them?
I mean I'm just curious what products you're thinking of when you say "shoves it down their throats"
1 reply →
> There isn't currently a forcing function.
Investors are the forcing function
> There isn't currently a forcing function
Sorry but if there wasn’t a forcing function then “Apple Picks Gemini to Power Siri” wouldn’t be the headline
> if there wasn’t a forcing function then “Apple Picks Gemini to Power Siri” wouldn’t be the headline
A pair of four-trillion-dollar companies striking a deal in the hottest technology space since the internet getting headline treatment is not evidence of a forcing function.
9 replies →
> Definitely. At at this point, Apple just needs to get anything out the door
They don't, though. Android is clearly ahead in AI integration (even Samsung is running TV ads mocking the iPhone's AI capability), yet iPhone sales are still breaking records: the majority of their phone buyers still prefer an iPhone over a more AI-capable alternative.
They can take their time to develop AI integration that others can't deploy: secure/private, deeply integrated with iCloud and location services, processing on device, etc. That will provide the product moat to increase sales.
The reality is that almost nobody actually wants LLMs in their phones.
They're not currently good enough for that use case, so almost all interactions make the UX worse.
That might change in the future; I'm just talking about today, in January 2026.
I think this is the wrong framing. Nobody cares whether there's an LLM in their phone. What people do want is features, like an improved Siri (still debatable beyond setting timers) or other improvements, that could potentially come from LLMs... if they actually work. So far other providers (such as Amazon with Alexa) have struggled to deliver a reliable voice assistant powered by LLMs.
I’m almost certain even something as ad hoc as Opus 4.5, with access to iOS native APIs at the level of Siri exposed via MCP, would run circles around Siri in January 2026.
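For what it's worth, the gist of that claim is easy to sketch: expose a couple of Siri-level actions as tool definitions an LLM can call. Everything below (tool names, schema shape, the dispatcher) is invented for illustration, not a real iOS or MCP API:

```python
import json

# Hypothetical tool definitions in the JSON-schema style most LLM
# tool-calling APIs use. Names and fields are made up for illustration;
# this is not a real iOS or MCP surface.
TOOLS = [
    {
        "name": "set_timer",
        "description": "Start a countdown timer.",
        "parameters": {
            "type": "object",
            "properties": {"seconds": {"type": "integer"}},
            "required": ["seconds"],
        },
    },
    {
        "name": "send_message",
        "description": "Send a text message to a contact.",
        "parameters": {
            "type": "object",
            "properties": {
                "contact": {"type": "string"},
                "body": {"type": "string"},
            },
            "required": ["contact", "body"],
        },
    },
]

def dispatch(tool_call):
    """Route a tool call emitted by the model to a stub handler."""
    name, args = tool_call["name"], tool_call["arguments"]
    if name == "set_timer":
        return f"timer set for {args['seconds']}s"
    if name == "send_message":
        return f"message to {args['contact']}: {args['body']}"
    return "unknown tool"

# The model would see json.dumps(TOOLS) in its prompt and emit a call;
# here we simulate what it might produce for "set a 5 minute timer".
prompt_tools = json.dumps(TOOLS)
print(dispatch({"name": "set_timer", "arguments": {"seconds": 300}}))
```

The point of the comment stands either way: a frontier model given even this crude a tool surface handles paraphrase and multi-step requests that the current intent-matching Siri cannot.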
3 replies →
> The reality is that almost nobody actually wants LLMs in their phones.
I don’t think that’s true. People just use the LLM apps. What people don’t feel like they need right now is deep LLM integration across the whole OS. IMO, that’s more of just not showing people the killer product yet.
1 reply →
I don’t need a Siri LLM. The current Siri is more than adequate for responding to texts and calling while driving. A lot of the “AI integrations” are marketing material for features nobody will actually use.
I don't NEED it, but if Siri could actually do anything you can do on your phone, it would be very nice.
> It was nearly two years ago
Just under 16 months since the release of iOS 18. The phones they would have sold this with shipped alongside 18.
Also, Apple indicated that the personalized Siri would not be available until later; it was expected in the spring release (March 2025).
What are the top 3 features you’re missing right now?
I'll bite
1. Have a user interface. Sometimes I'll ask a question and Siri actually provides a good-enough answer, and while I'm reading it, the Siri response window just disappears. Siri is a modal popup with no history, no app, and really no UI at all. It should have one, so that I can go back to sessions and resume them or reference them later, and interact with Siri in more meaningful ways.
2. Answer questions like a modern LLM does. Siri often responds with very terse web links. I find this useful when I'm sitting with friends and we don't remember if Liam Neeson is alive or not: basic fact-checking. This is the only use case where I've found it useful, when I want to peel my attention away for the shortest period of time. If ChatGPT could be bound to a power-button long-press, I'd cease to use Siri for this use case. Otherwise Siri isn't good for long questions, because it doesn't have the intelligence and, as mentioned before, has no user interface.
3. Be able to do things conversationally, based on my context. Today, when I "Add to my calendar Games at Dave's house" it creates a calendar entry called "Games" and sets the location to a restaurant called "Dave's House" in a different country. My baseline expectation is that I should be able to work with Siri, build its memory and my context, and over time it becomes smarter about the things I like to do. The day Siri responds with "Do you mean Dave's House the restaurant in another country, or Dave, from your contacts?" I'll be happy.
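The "Dave's house" failure in point 3 is essentially a ranking bug: personal context (your contacts) should outrank a global points-of-interest search, and a tie should trigger exactly the clarifying question the commenter asks for. A toy sketch of that priority, with invented data and a deliberately crude name heuristic:

```python
def resolve_location(phrase, contacts, poi_index):
    """Resolve a spoken location phrase, preferring the user's own
    contacts over a global points-of-interest index, and asking a
    clarifying question when both match. Toy heuristic, invented data."""
    # "Dave's house" -> candidate contact name "Dave"
    name = phrase.split("'")[0].strip()
    contact_hit = next((c for c in contacts if c.lower() == name.lower()), None)
    poi_hit = poi_index.get(phrase.lower())
    if contact_hit and poi_hit:
        return f"Did you mean {poi_hit}, or {contact_hit} from your contacts?"
    if contact_hit:
        return f"{contact_hit}'s home address (from contacts)"
    if poi_hit:
        return poi_hit
    return "no match"

# Ambiguous case: a contact named Dave AND a restaurant called "Dave's House"
print(resolve_location("Dave's house", ["Dave"],
                       {"dave's house": "Dave's House (restaurant)"}))
```

Today's Siri effectively skips the first two branches and jumps straight to the POI hit, which is how a games night ends up booked at a restaurant in another country.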
For 1, I think we are getting farther away from this.
Siri's current architecture now provides context in the prompt, such as the app/window that has focus and the content loaded into it. In that sense, Siri is more like the macOS menu bar than an app. A consolidated view of Siri history may look disjointed, in that a lot of context is hidden if all it shows is a query like "when was this building built?".
Even more so, it might not provide the functionality desired if you go look at historic chats and ask "who was the architect?", unless all that context was actually captured. However, that context was never formatted in a way that was intended to be clearly displayed to the user. That in itself creates a lot of challenges around things like user consent since Siri can farm off queries to other (online) tools and world-knowledge AI services.
There is at least a UX paradigm for this: clipboard history. Coincidentally, Tahoe built clipboard history into Spotlight. But clipboard history lends itself to being a more complete and self-contained snapshot. I'm not sure Siri is being built to work this way, because of implicit context.
For 2, at a certain point this gets farmed off to other tools or other AI services. The Gemini agreement is for the foundational model, not large "world knowledge" models or backing databases. Today, Siri answers this question by providing bibliographical information inline from Wikipedia, using internal tools. The model itself just isn't able to answer the actual question (e.g. it will just say his birthday).
For 3, the model already has substantial personal context (as much as apps are willing to give it) and does have state in between requests. This is actually one of the issues with Siri today - that context changes the behavior of the command and control engine in interesting ways, phone to phone and sometimes moment to moment.
Unfortunately, I think stopping and asking for clarification is not something generative AI currently excels at.
Thanks for sharing. 1. Could be fixed today. 2./3. need a good enough LLM.
btw: I hope you will visit Dave's House someday in the future.
My wife and I got a kick out of your “Games at Dave's house” example. Thanks for sharing
1 reply →
>If ChatGPT could be bound to a power button long-press, then I'd cease to use Siri for this use case
This should be possible, go to Settings->Action Button->Controls and search for ChatGPT
Isn’t its voice the UI? It should respond using the same context as the request: voice and natural language.
If you ask for a website it should open a browser.
Edit: everything else spot on
1 reply →
Yes, the lack of context or history has annoyed me in the past too.
Also, Liam Neeson just catching strays over here
I'm sorry, I can't answer that right now.
Would you like to click this button which takes what you said and executes it as a Google search in Safari?
Now playing You're Missing, by Bruce Springsteen on Apple Music
2 replies →
Siri to function above the level of Dragon NaturallySpeaking '95
Fantastic reference. I remember pirating this from microcrap.com in about 1996.
9 replies →
ANY ability to answer simple questions without telling me to open Safari and read a webpage for myself...?
I should be able to completely control my phone with voice and ask it to do anything it is capable of and it should just work:
"Hi Siri, can you message Katrina on WhatsApp that Judy is staying 11-15th Feb and add it to the shared Calendar, confirm with me the message to Kat and the Calendar start and end times and message."
They will never do this, and the lack of it can be marketed as a security feature.
1 reply →
Could it just fucking work? "Hey Siri turn on the [room name] room lights" and it gives me a positive chime and ... doesn't turn any lights on? In any of my rooms?
Judging by another comment, it probably turned the lights on in a restaurant in a different country.
1 reply →
Apple’s insistence on not ever displaying error messages is infuriating.
Elementary anti-spam.
I still think Apple should, at least for Apple One customers, offer small, private models trained on your personal iMessage, image, and video archives in iCloud, with easy-to-use, granular controls for content inclusion/exclusion.
Will make it much easier to find those missing pictures from a few years ago...
I consider Apple to be practical. Also, Apple will be running Gemini on its own hardware, which is better than buying Perplexity and running the Chinese models Perplexity relies on. Training models is a money pit; it's better to rent models than train your own. If everyone is training models, they're going to become a commodity, and this is not the final architecture anyway.
The iPhone 16 was released 16 months ago, not “nearly” 24 months.
Well it ain't coming now either if it's just Gemini, is it?