Apple picks Gemini to power Siri

3 days ago (cnbc.com)

If nothing else, this was likely driven by Google being the most stable of the AI labs. Gemini is objectively a good model (whether it's #1 or #5 in ranking aside) so Apple can confidently deliver a good (enough) product. Also for Apple, they know their provider has ridiculously deep pockets, a good understanding and infrastructure in place for large enterprises, and a fairly diversified revenue stream.

Going with Anthropic or OpenAI, despite their having that clean Apple smell and feel on the surface, carries a lot of risk on Apple's part. Both companies are far underwater, liable to take risks, and liable to drown if they fall even a bit behind.

  • > Gemini is objectively a good model (whether it's #1 or #5 in ranking aside) so Apple can confidently deliver a good (enough) product

    Definitely. At this point, Apple just needs to get anything out the door. It was nearly two years ago that they sold a phone with features that still haven't shipped, on the promise that Apple Intelligence would come in two months.

    • Yes, but they also haven't generated spicy deepfakes or talked kids into suicide with their products.

      It's just how Apple does things: they still have no folding phone, under-screen fingerprint scanner, under-screen front camera, etc.

      43 replies →

    • > It was nearly two years ago

      Just under 16 months since the release of iOS 18. The phones they would have sold this with shipped alongside 18.

      Also, Apple indicated that the personalized Siri would not be available until later; it was expected in the spring release (March 2025).

    • > At this point, Apple just needs to get anything out the door

      To the extent Cupertino fucked up, it's in having had this attitude when they rolled out Apple Intelligence.

      There isn't currently a forcing function. Apple owns the iPhone, and that makes it an emperor among kings. Its wealth is also built on starting with user problems and then working backwards to the technology, versus embracing whatever's hot and trying to shove it down our throats.

      16 replies →

    • > Definitely. At this point, Apple just needs to get anything out the door

      They don't, though. Android is clearly ahead in AI integration (even Samsung is running TV ads mocking the iPhone's AI capability), yet iPhone sales are still breaking records - the majority of their phone buyers still prefer an iPhone over a more AI-capable alternative.

      They can take their time to develop AI integration that others can't deploy - secure/private, deeply integrated with iCloud and location services, processed on device, etc. - which will provide the product moat to increase sales.

      10 replies →

    • I still think Apple should, at least for Apple One customers, offer small, private models trained on your personal iMessage, image, and video archives in iCloud, with easy-to-use, granular controls for content inclusion/exclusion.

      Will make it much easier to find those missing pictures from a few years ago...

    • I consider Apple to be practical. Also, Apple will be running Gemini on its own hardware. This is better than buying Perplexity and running the Chinese model Perplexity runs on. Training models is a money-burning game; it's better to rent models than to train your own. If everyone is training models, they are going to become a commodity, and this is not the final architecture anyway.

  • Nothing about OpenAI is clean. Their entire org is controlled by Altman, who was able to rehire himself after he was fired.

    Anthropic doesn't have a single data centre; they rent from AWS/Microsoft/Google.

    • Also: they've dealt with Google on default search engine deals before, even through the early Android/iPhone competition days (and for the early iPhone's Weather, Stocks, etc.). It's a familiar enough dynamic.

  • I was more thinking about this being driven by the fact that Google pays Apple $20B a year for being the pre-selected search engine and this way, Apple still gets $19B and a free AI engine on top.

    • It was $20 billion years ago, in 2022. There's little doubt it's closer to $25B now, perhaps more.

  • Counterpoint: iOS’s biggest competitor is Android. They are now effectively funding their competition on a core product interface. I see this as strategically devastating.

    • Counterpoint: Google is paying Apple $20b/year to keep themselves as the default search engine in iOS. Android's biggest competitor is iOS. They are now effectively funding their competition on a core product interface. I see this as strategically devastating.

      1 reply →

    • It's strategically devastating because no small number of users choose Apple because they do not trust Google, and now they have no choice but to have Google AI on board their machines.

      I respect Google's engineering, and I'm aware that fundamental technologies such as Protocol Buffers and FlatBuffers are unavoidably integrated into the software fabric, but this is avoidable.

      I'm surprised Google aren't paying Apple for this.

      7 replies →

    • Is Android really iOS's competition? I feel like the competition is less Android and more the vendors who use Android. Every Android phone feels different. Android doesn't even compete on performance anymore; the chips are quite behind. The target audiences of the two feel different lately.

      2 replies →

  • Yup, Anthropic has constant performance problems (not enough GPUs), and OpenAI is too messy with its politics and Altman.

  • True. Also Gemini is the boring model, heavily sanitised for corporate applications. At least it admits this if you press it. It fits Apple here very well.

    Personally I wouldn't use it; it still belongs to an advertiser specialised in extracting user information. Not that I expect the other AI companies to value privacy much more. But a clean smell also means a bland smell.

    • I suspect you're exactly right about it being the most sanitized model.

      I don't however like the idea of having Google deeply embedded in my machine and Siri will definitely be turned off when this happens. I only use Siri as an egg timer anyway.

      This seems like an odd move for a company that sells privacy.

      1 reply →

    • Google, as the designer of the original transformer, is also the designer of the original "mechanism" for inserting ads into a prompt answer in real time for the highest bidder, so it makes sense from that angle too.

      Given my stance about AI, I'll definitely not use it, but I understand Apple's choice. Also this choice will give them enough time to develop their infrastructure and replace parts of it with their own, if they are planning to do it.

      > Not that I expect that other AI companies value privacy much higher.

      Breaching privacy and using it for its own benefit is AI's business model. There are no ethical players here, neither from a training perspective nor from the perspective of respecting users' privacy. It's just the next iteration of what social media companies do.

  • >If nothing else, this was likely driven by Google being the most stable of the AI labs.

    I don't think the model matters that much, given they thought Siri was half-decent for so long.

    Judging from the past 10 years, I would say this is more likely driven by a bigger package deal with Google Search placement and Google Cloud services, everything else being roughly equal.

    Instead of raising the price again and paying Apple even more per user: how about we pay a bit less but throw Gemini in with it?

    Apple has been very good, if not the best, at picking one side and letting the others fight for its contract. They don't want Microsoft to win the AI race, yet at the same time Apple is increasing its use of Azure just in case. Basically playing the game of leverage at its best. In hindsight, they probably played it too well and forgot what all this leverage is for: not cost savings, but ultimately a better-quality product.

    • > I would say this is more likely driven by part of a bigger package deal with Google Search Placement and Google Cloud Services.

      Can the DOJ and FTC look into this?

      Google shouldn't be able to charge a fee on accessing every registered trademark in the world. They use Apple to get the last 30% of "URL bars", I mean Google Search middlemen.

      Searching Anthropic gets me a bidding war, which I'm sure is bleeding Google's competition dry.

      We need a "no bare trademark (plus edit distance) ads or auto suggest" law. It's made Google an unkillable OP monster. Any search monopoly or marketplace monopoly should be subject to not allowing ads to be sold against a registered trademark database.

      1 reply →

  • I agree with your point about Google being a more stable company than the rest, so the decision probably makes sense. But there was a study by multiple news organizations in Czechia, asking about news topics, and Gemini was consistently the worst at citations and was often straight-up incorrect (76% of its answers had "issues"; I don't have the exact specification of the issues).

  • It has nothing to do with how good Gemini is relative to others. Apple is picking Gemini because they don’t want AI to be the selling point for Android phones. Apple execs do not care about innovations. They only care about keeping their monopoly intact.

  • With Anthropic or OpenAI they would have had to pay for it, but Google already pay them $20bn+ per year to be the default search engine - so they just knock $1bn off Google's bill for Gemini

    • What's the difference between Apple getting $20B from Google and paying $1B to another company, or just getting $19B from Google and not spending anything?

      1 reply →

The writing was on the wall the moment Apple stopped trying to buy their way into the server-side training game - like, what, three years ago?

Apple has the best edge inference silicon in the world (neural engine), but they have effectively zero presence in a training datacenter. They simply do not have the TPU pods or the H100 clusters to train a frontier model like Gemini 2.5 or 3.0 from scratch without burning 10 years of cash flow.
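For a sense of scale, that training-cost claim can be sanity-checked with the widely cited ~6·N·D FLOPs rule of thumb for dense transformers. Every concrete number below (model size, token count, cluster size, utilization) is an illustrative assumption, not a figure from this deal:

```python
# Back-of-the-envelope frontier-training estimate using the common
# approximation: training compute ~ 6 * N * D FLOPs
# (N = parameter count, D = training tokens).
# Every concrete number here is an illustrative assumption.

params = 500e9   # 500B-parameter dense model (assumed)
tokens = 15e12   # 15T training tokens (assumed)
flops = 6 * params * tokens  # 4.5e25 FLOPs

gpus = 20_000    # cluster size (assumed)
peak = 1e15      # ~1 PFLOP/s peak per accelerator (assumed)
mfu = 0.4        # 40% model FLOPs utilization (assumed)

seconds = flops / (gpus * peak * mfu)
print(f"{flops:.1e} FLOPs, roughly {seconds / 86_400:.0f} days on {gpus:,} GPUs")
```

Roughly two months on a 20k-accelerator cluster under these assumptions; the point is that the dominant cost is standing up and feeding the cluster itself, which Apple does not have.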

To me, this deal is about the bill of materials for intelligence. Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own. Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence. Am I missing the elephant in the room?

It's a smart move. Let Google burn the gigawatts training the trillion parameter model. Apple will just optimize the quantization and run the distilled version on the private cloud compute nodes. I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.
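The quantization point can be made concrete: weight memory is roughly parameter count times bits per weight. A minimal sketch, where the model size, bit width, overhead, and iPhone RAM figure are all illustrative assumptions:

```python
# Rough on-device memory estimate for a quantized LLM.
# All concrete figures are illustrative assumptions.

def model_memory_gb(params_billion: float, bits_per_weight: float,
                    overhead_gb: float = 1.0) -> float:
    """Weight memory plus a flat allowance for KV cache and activations."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 + overhead_gb

# A mid-size ~7B model at 4-bit quantization needs ~4.5 GB:
need = model_memory_gb(7, 4)
iphone_ram_gb = 8  # typical recent iPhone (assumed)
print(f"needs ~{need:.1f} GB of {iphone_ram_gb} GB total")
```

Once the OS and foreground apps take their share there is little headroom, which is why only small, heavily distilled models run on the phone while anything frontier-sized stays server-side.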

  • > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.

    Setting aside the obligatory HN dig at the end, LLMs are now commodities and the least important component of the intelligence system Apple is building. The hidden-in-plain-sight thing Apple is doing is exposing all app data as context and all app capabilities as skills. (See App Intents, Core Spotlight, Siri Shortcuts, etc.)

    Anyone with an understanding of Apple's rabid aversion to being bound by a single supplier understands that they've tested this integration with all foundation models, that they can swap Google out for another vendor at any time, and that they have a long-term plan to eliminate this dependency as well.

    > Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own.

    I'd be interested in a citation for this (Apple introduced two multilingual, multimodal foundation language models in 2025), but in any case anything you hear from Apple publicly is what they want you to think for the next few quarters, vs. an indicator of what their actual 5-, 10-, and 20-year plans are.

    • My guess is that this is bigger lock-in than it might seem on paper.

      Google and Apple together will post-train Gemini to Apple's specification. Google has the know-how as well as the infra, and will happily do this (for free-ish) to continue the mutually beneficial relationship - as well as to lock out competitors that asked for more money (Anthropic).

      Once this goes live, provided Siri improves meaningfully, it is quite an expensive experiment to then switch to a different provider.

      For any single user, the switching costs to a different LLM are next to nothing. But at Apple's scale they need to be extremely careful and confident that the switch is an actual improvement

      21 replies →

    • > what their actual 5-, 10-, and 20-year plans are

      Seems like they are waiting for the "slope of enlightenment" on the Gartner hype cycle to flatten out. Given you can just lease or buy a SOTA model from leading vendors, there's no advantage to training your own right now. My guess is that the LLM/AI landscape will look entirely different by 2030 and any 5-year plan won't be in the same zip code, let alone playing field. Leasing an LLM from Google with a support contract seems like a pretty smart short-term play as things continue to evolve over the next 2-3 years.

      9 replies →

    • That's not an "obligatory HN dig" though; you're watching, in medias res, X escape removal from the App Store and Play Store. Concepts like privacy, legality, and high-quality software are all theater. We have no altruists defending these principles for us at Apple or Google.

      Apple won't switch Google out as a provider for the same reason Google is your default search provider. They don't give a shit about how many advertisements you're shown. You are actually detached from 2026 software trends if you think Apple is going to give users significant backend choices. They're perfectly fine selling your attention to the highest bidder.

      10 replies →

    • > LLMs are now commodities and the least important component of the intelligence system Apple is building

      If that was even remotely true, Apple, Meta, and Amazon would have SoTA foundational models.

      2 replies →

  • An Apple-developed LLM would likely be worse than SOTA, even if they dumped billions on compute. They'll never attract as much talent as the others, especially given how poorly their AI org was run (reportedly). The weird secrecy will be a turnoff. The culture is worse and more bureaucratic. The past decade has shown that Apple is unwilling to fix these things. So I'm glad Apple was forced to overcome their Not-Invented-Here syndrome/handicap in this case.

    • Apple might have gotten very lucky here ... the money might be in finding uses, and selling physical products rather than burning piles of cash training models that are SOTA for 5 minutes before being yet another model in a crowded field.

      My money is still on Apple and Google to be the winners from LLMs.

      21 replies →

    • Reportedly, Meta is paying top AI talent up to $300M for a 4 year contract. As much as I'm in favor of paying engineers well, I don't think salaries like this (unless they are across the board for the company, which they are of course not) are healthy for the company long term (cf. Anthony Levandowski, who got money thrown after him by Google, only to rip them off).

      So I'm glad Apple is not trying to get too much into a bidding war. As for how well orgs are run, Meta has its issues as well (cf the fiasco with its eponymous product), while Google steadily seems to erode its core products.

      1 reply →

  • Is the training cost really that high, though?

    The Allen Institute (a non-profit) just released the Molmo 2 and Olmo 3 models. They trained these from scratch using public datasets, and they are performance-competitive with Gemini in several benchmarks [0] [1].

    AMD was also able to successfully train an older version of OLMo on their hardware using the published code, data, and recipe [2].

    If a non-profit and a chip vendor (training for marketing purposes) can do this, it clearly doesn't require "burning 10 years of cash flow" or a Google-scale TPU farm.

    [0]: https://allenai.org/blog/molmo2

    [1]: https://allenai.org/blog/olmo3

    [2]: https://huggingface.co/amd/AMD-OLMo

    • No, of course the training costs aren't that high. Apple's ten years of future free cash flow is greater than a trillion dollars (they are above $100b per year). Obviously, the training costs are a trivial amount compared to that figure.

      7 replies →

    • No, it doesn't beat Gemini in any benchmarks. It beats Gemma, which isn't SoTA even among open models of that size; that would be Nemotron 3 or GPT-OSS 20B.

  • Yeah, I think it's smart, too. There are multiple companies who have spent a fortune on training and are going to be increasingly interested in (desperate to?) seeing a return from it. Apple can choose the best of the bunch, pay less than it would cost to build it themselves, and swap to a new one if someone produces another breakthrough.

    • 100%. It feels like Apple is perfectly happy letting the AI labs fight a race to the bottom on pricing while they keep the high-margin user relationship.

      I'm curious if this officially turns the foundation model providers into the new "dumb pipes" of the tech stack?

      5 replies →

  • > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.

    This sort of thing didn't work out great for Mozilla. Apple, thankfully, has other business bringing in the revenue, but it's still a bit wild to put a core bit of the product in the hands of the only other major competitor in the smartphone OS space!

    • I dunno, my take is that Apple isn't outsourcing intelligence; rather, it's outsourcing the most expensive, least defensible layer.

      Down the road, Apple has an advantage here: a super-large training data set that includes messages, mail, photos, calendar, health, app usage, location, purchases, voice, biometrics, and your behaviour over YEARS.

      Let's check back in 5 years and see if Apple is still using Gemini or if Apple distills, trains and specializes until they have completed building a model-agnostic intelligence substrate.

  • > The writing was on the wall the moment Apple stopped trying to buy their way into the server-side training game like what three years ago?

    It goes back much further than that - up until 2016, Apple wouldn't let its ML researchers add author names to published research papers. You can't attract world-class talent in research with a culture built around paranoid secrecy.

    • > You can't attract world-class talent in research with a culture built around paranoid secrecy.

      Would giving more money/shares help?

  • Seems like there is a moat after all.

    The moat is talent, culture, and compute. Apple doesn't have any of these 3 for SOTA AI.

  • I always think about this, can someone with more knowledge than me help me understand the fragility of these operations?

    It sounds like the value of these very time-consuming, resource-intensive, and large scale operations is entirely self-contained in the weights produced at the end, right?

    Given that we have a lot of other players enabling this in other ways, like Open Sourcing weights (West vs East AI race), and even leaks, this play by Apple sounds really smart and the only opportunity window they are giving away here is "first to market" right?

    Is it safe to assume that eventually the weights will be out in the open for everyone?

    • > and the only opportunity window they are giving away here is "first to market" right?

      A lot of the hype in LLM economics is driven by speculation that eventually training these LLMs is going to lead to AGI and the first to get there will reap huge benefits.

      So if you believe that, being "first to market" is a pretty big deal.

      But in the real world there's no reason to believe LLMs lead to AGI, and given the fairly lock-step nature of the competition, there's also not really a reason to believe that even if LLMs did somehow lead to AGI that the same result wouldn't be achieved by everyone currently building "State of the Art" models at roughly the same time (like within days/months of each other).

      So... yeah, what Apple is doing is actually pretty smart, and I'm not particularly an Apple fan.

    • > is entirely self-contained in the weights produced at the end, right?

      Yes, and the knowledge gained along the way. For example, the new TPUv4 that Google uses requires rack and DC aware technologies (like optical switching fabric) for them to even work at all. The weights are important, and there is open weights, but only Google and the like are getting the experience and SOTA tech needed to operate cheaply at scale.

  • Apple's goal is likely to run all inference locally. But models aren't good enough yet and there isn't enough RAM in an iPhone. They just need Gemini to buy time until those problems are resolved.

    • Phones will get upgrades, but then so will servers. The local models will always be behind the state of the art running on big iron. You can’t expect to stand still and keep up with the Red Queen.

    • That was their goal, but in the past couple years they seem to have given up on client-side-only ai. Once they let that go, it became next to impossible to claw back to client only… because as client side ai gets better so does server side, and people’s expectations scale up with server side. And everybody who this was a dealbreaker for left the room already.

      5 replies →

    • Well DRAM prices aren't going down soon so I see this as quite the push away from local inference.

    • 10 years' worth of cash? So all these Chinese labs that came out and did it for less than $1 billion must have 3 heads per developer, right?

    • Rumor has it that they weren't trained "from scratch" the way US labs' models were, i.e. Chinese labs benefitted from government-"procured" IP (the US $B models) in order to train their $M models. I also understand there to be real innovation in the many-MoE architecture on top of that. Would love to hear a more technical understanding from someone who does more than repeat rumors, though.

    • We don't really know how much it cost them. Plenty of reasons to doubt the numbers passed around and what it wasn't counting.

      (And even if you do believe it, they also aren't licensing the IP they're training on, unlike American firms, who are now paying quite a lot for it.)

    • A lot of HN commentators are high on their own supply with regard to the AI bubble... when you realize that this stuff isn't actually that expensive the whole thing begins to quickly unravel.

  • It also lets them keep a lot of the legal issues regarding LLM development at arms length while still benefiting from them.

  • > Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence.

    They have always been a premium "last mile" delivery network for someone else's intelligence, except that "intelligence" was always IP until now. They have always polished existing (i.e., not theirs) ideas and made them bulletproof and accessible to the masses. Seems like they intend to just do more of the same for AI "intelligence". And good for them, as it is their specialty and it works.

  • Could you elaborate a bit on why you've judged it as privacy theatre? I'm skeptical but uninformed, and I believe Mullvad are taking a similar approach.

    • Mullvad is nothing like Apple. For Apple devices:

      - need a real email and real phone number to even boot the device
      - cannot disable telemetry
      - App Store apps only, even though many key privacy-preserving apps are not available
      - /etc/hosts is not your own, and DNS control in general is extremely weak
      - VPN apps on iDevices have artificial holes
      - can't change the push notification provider
      - can only use WebKit for browsers, which lacks many important privacy-preserving capabilities
      - need to use an app you don't trust but want to sandbox it from your real information? Too bad, no way to do so
      - the source code is closed, so Apple can claim X but do Y; you have no proof that you are secure or private
      - without control of your OS, you are subject to Apple complying with the government and pushing updates to serve them, not you, which they are happy to do to make a buck

      Mullvad requires nothing but an envelope with cash in it and a hash code and stores nothing. Apple owns you.

      7 replies →

    • They transitioned from “nobody can read your data, not even Apple” to “Apple cannot read your data.” Think about what that change means. And even that is not always true.

      They also were deceptive about iCloud encryption where they claimed that nobody but you can read your iCloud data. But then it came out after all their fanfare that if you do iCloud backups Apple CAN read your data. But they aren’t in a hurry to retract the lie they promoted.

      Also, if someone in another country messages you, and that country's laws require Apple to provide the name, email, phone number, and message content of local users, guess what: since they messaged you, not only their name and information but also your name, private information, and message content are shared with that country's government. By Apple. Do they tell you? No, even if your own country respects privacy. Does Apple have a help article explaining this? No.

      4 replies →

    • Because Apple makes privacy claims all the time, but all their software is closed source and it is very hard or impossible to verify any of their claims. Even if messages sent between iPhones are E2EE encrypted for example, the client apps and the operating system may be backdoored (and likely are).

      https://en.wikipedia.org/wiki/PRISM

  • It’s also a bet that the capex cost for training future models will be much lower than it is today. Why invest in it today if they already have the moat and dominant edge platform (with a loyal customer base upgrading hardware on 2-3 year cycles) for deploying whatever future commoditized training or inference workloads emerge by the time this Google deal expires?

  • Personally I also think it's a very smart move - Google has TPUs and will do it more efficiently than anyone else.

    It also lets Apple stand by while the dust settles on who will out-innovate whom in the AI war - they could easily enter the game in a big way much later on.

  • Seems like the LLM landscape is still evolving, and training your own model provides no technical benefit as you can simply buy/lease one, without the overhead of additional eng staffing/datacenter build-out.

    I can see a future where LLM research stalls and stagnates, at which point the ROI on building/maintaining their own commodity LLM might become tolerable. Apple has had Siri as a product/feature, and they've proven for the better part of a decade that voice assistants are not something they're willing to build proficiency in. My wife has had an iPhone for at least a decade now, and I've heard her use Siri perhaps twice in that time.

    • And if you wanted to build your own data center right now there’s only so much GPU and RAM to go around, and even all the power generation and cooling manufacturers are booked solid.

    • The trouble is, this seems to me like a short-term fix. Longer term, once the models are much better, Google can just lock Apple out, take everything for itself, and leave Apple nowhere and even further behind.

    • Of course there is going to be an abstraction layer - this is like Software Engineering 101.

      Google couldn't care less about Android being good. It is a client for Google search and Google services - just like the iPhone is a client for Google search and apps.

  • > without burning 10 years of cash flow

    AAPL has approximately $35 billion of cash equivalents on hand. What other use might they have for this trove? Buy back more stock?

  • > They simply do not have the TPU pods or the H100 clusters to train a frontier model like Gemini 2.5 or 3.0 from scratch without burning 10 years of cash flow.

    Why does Apple need to build its own training cluster to train a frontier model, anyway?

    Why couldn't the deal we're reading about have been "Apple pays Google $200bn to lease exclusive-use timeslots on Google's AI training cluster"?

    • That would be more expensive in the long run, and Apple is all about the long game.

  • Agreed, especially since this is a competitive space with multiple players and a high price of admission, where your model is outdated in a year - so it's not even capex so much as recurring expenditure. Far better to let someone else do all the hard work and wait and see where things go. Maybe someday this will be a core competency you want in-house, but when that day comes you can make that switch, just like with Apple silicon.

  • > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain

    I feel like people probably said this when Google became the default search engine for everyone...

  • Apple sells consumer goods first and foremost. They likely don't see a return on investment through increased device or services sales to match the hundreds of billions that these large AI companies are throwing down every year.

  • > without burning 10 years of cash flow.

    Sorry to nitpick, but Apple's free cash flow is ~$100B/yr. Training a model to power Siri would not cost more than a trillion dollars.

    • Of all the companies to survive a crash in AI unscathed, I would bet on Apple the most.

      They are the only ones who do not have large debts off (or on) the balance sheet or aggressive long-term contracts with model providers, and their product demand/cash flow is the least dependent on the AI industry's performance.

      They will still be affected by a general economic downturn, but not as deeply as the AI-charged companies in big tech.

  • >Am I missing the elephant in the room?

    Everyone using Siri is going to have their personality data emulated and simulated as a ”digital twin” in some computing hell-hole.

  • >Apple has the best edge inference silicon in the world (neural engine),

    Can you cite this claim? The Qualcomm Hexagon NPU seems to be superior in the benchmarks I've seen.

  • > without burning 10 years of cash flow.

    Don't they have the highest market cap of any company in existence?

    • They have the largest free cash flow (over $100 billion a year). Meta and Amazon have less than half that a year, and Microsoft/Nvidia are between $60b-70b per year. The statement reflects a poor understanding of their financials.

  • the year is 2026, the top advertising company is in bed with the walled garden device specialists and the decision is celebrated

  • this also addresses something else ...

    apple to some users "are you leaving for android because of their ai assistant? don’t leave we are bringing it to iphone"

  • > To me, this deal is about the bill of materials for intelligence. Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own. Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence. Am I missing the elephant in the room?

    Probably not missing the elephant. They certainly have the money to invest and they do like vertical integration but putting massive investment in bubble that can pop or flatline at any point seems pointless if they can just pay to use current best and in future they can just switch to something cheaper or buy some of the smaller AI companies that survive the purge.

    Given how AI-capable their hardware is, they might just move most of it on-device too.

  • Honestly, I'm relieved... it's not really in their DNA and not pivotal to their success; why U-turn the company into a market that's vaguely defined and potentially algorithmically limited?

  • > without burning 10 years of cash flow.

    Wasn't Apple sitting on a pile of cash and having no good ideas what to spend it on?

    • Perhaps spending it on inference that will be obsoleted in 6 months by the next model is not a good idea either.

      Edit: especially given that Apple doesn’t do b2b so all the spend would be just to make consumer products

      1 reply →

    • The cash pile is gone; they have been active in share repurchases.

      They still generate about ~$100 billion in free cash per year, that is plowed into the buybacks.

      They could spend more cash than every other industry competitor. It's ludicrous to say that they would have to burn 10 years of cash flow on trivial (relative) investment in model development and training. That statement reflects a poor understanding of Apple's cash flow.

  • > Am I missing the elephant in the room?

    Apple is flush with cash and other assets, they have always been. They most likely plan to ride out the AI boom with Google's models and buy up scraps for pennies on the dollar once the bubble pops and a bunch of the startups go bust.

    It wouldn't be the first time they went for full vertical integration.

  • Calling the Neural Engine the best is pretty silly. It is perhaps the best of what is uniformly a failed class of IP blocks: mobile inference NPU hardware. Edge inference on Apple is dominated by CPUs and Metal, which don't use the NPU.

Apple has seemingly confirmed that the Gemini models will run under their Private Cloud Compute and so presumably Google would not have access to Siri data.

https://daringfireball.net/linked/2026/01/12/apple-google-fo...

  • Neither Apple's nor Google's announcement says Siri will use Gemini models. Both announcements say, word for word, "Google’s technology provides the most capable foundation for Apple Foundation Models". I don't know what that means, but Apple and Google's marketing teams must have crafted that awkward wording carefully to satisfy some contractual nuance.

    • Direct quote from Google themselves:

      "Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards."

      5 replies →

    • Apple likely wants to post-train a pre-trained model, probably along with some of Google's heavily NDA'ed training techniques too.

    • > "Google’s technology provides the most capable foundation for Apple Foundation Models"

      Beyond Siri, Apple Foundation Models are available as API; will Google's technologies thus also be available as API? Will Apple reduce its own investment in building out the Foundation models?

    • Most likely the wording was crafted by an artificially intelligent entity.

This is a bit of a layer cake:

1. The first issue is that there is significant momentum in calling Siri bad, so even if Apple released a higher quality version it will still be labelled bad. It can enhance the user's life and make their device easier to use, but the overall press will be cherrypicked examples where it did something silly.

2. Basing Siri on Google's Gemini can help to alleviate some of that bad press, since a non-zero share of that doomer commentary comes from brand-loyalists and astroturfing.

3. The final issue is that on-device Siri will never perform like server-based ChatGPT. So in a way it's already going to disappoint some users who don't realise that running something on mobile device hardware is going to have compromises which aren't present on a server farm. To help illustrate that point: We even have the likes of John Gruber making stony-faced comparisons between Apple's on-device image generator toy (one that produces about an image per second) versus OpenAI's server farm-based image generator which makes a single image in about 1-2 minutes. So if a long-running tech blogger can't find charity in those technical limitations, I don't expect users to.

  • Siri is objectively bad though. It isn't some vendetta. I am disabled, and there are at least 50 different things that I'd love Siri to do that should be dead simple, yet it cannot. My favorite was when I suffered a small but not serious fall and decided to test whether Siri could be told to call 911 while I was less than 6 feet away; it absolutely could not understand, let alone execute, my request. It's a lot of stuff like this. Its core functionality often just does not work.

    > The final issue is that on-device Siri will never perform like server-based ChatGPT. So in a way it's already going to disappoint some users who don't realise that running something on mobile device hardware is going to have compromises which aren't present on a server farm.

    For many years, Siri requests were sent to an external server. It still sucked.

    • I don't think the parent said that Siri wasn't bad, on the contrary it sounds like they agree.

      Their point is that if Apple totally scraps the current, bad, product called "Siri" and replaces it with an entirely different, much better product that is also named "Siri" but shares nothing but the name, people's perceptions of the current bad Siri will taint their impressions of the new one.

      4 replies →

    • > Hey Siri, call me an ambulance!

      > Alright, from now on I will call you Anne Ambulance.

    • I'd be skeptical about even new LLM siri being able to dial 911.

      These models tend to have a "mind of their own", and I can totally, absolutely, see a current SOTA LLM convincing itself it needs to call 911 because you asked it how to disinfect a cut.

      1 reply →

    • In the 15 years since I've been an Apple user, Siri has never worked for me when I really needed it.

  • There are many people who lament that Siri sucks but would be happy to admit if/when this changes. Even if it goes from super shitty (as evidenced by randomly calling people I have never called/texted when I ask it to call my wife) to "pretty good" I will be the first to admit that it is better. I look forward to it getting better and being able to use it more often.

Forget Siri, I have a much lower bar — I’ll be happy if they just improve iOS typing corrections/completions, which often don’t make any sense given the rest of the sentence.

  • Not just rest of the sentence. In my opinion, autocorrect desperately needs to take into account the context of the current screen.

    There are many times I want to type the same word that is already on the app screen but it autocorrects me to something completely different.

    • And they could add predictive text to other languages too, it's not rocket science.

      The current system suggests words I have never used, will never use and have never heard before instead of the obvious choice.

  • It's the same core issue, which is basically that their software stack sort of sucks. They should definitely wipe the slate clean when it comes to anything related to language, and that includes typing, text-to-speech, speech-to-text, agents, etc.

    They have the time and the money and the customers, so I'm confident they will accomplish great things.

  • I’ve been debating turning auto-correct off completely. However, the first iPhone had it, so I’m guessing I want some level of it. I just don’t understand how v1 was better than what we have 18 years later.

Related: Apple nears $1B Google deal for custom Gemini model to power Siri (71 points, 2 months ago, 47 comments) https://news.ycombinator.com/item?id=45826975

  • The biggest NEW thing here is that this isn't white-labeled. Apple is officially acknowledging Google as the model that will be powering Siri. That explicit acknowledgment is a pretty big deal. It will make it harder for Apple to switch to its own models later on.

    • Where does it say that it won't be white-labeled?

      Yes, Apple is acknowledging that Google's Gemini will be powering Siri and that is a big deal, but are they going to be acknowledging it in the product or is this just an acknowledgment to investors?

      Apple doesn't hide where many of their components come from, but that doesn't mean that those brands are credited in the product. There's no "fab by TSMC" or "camera sensors by Sony" or "display by Samsung" on an iPhone box.

      It's possible that Apple will credit Gemini within the UI, but that isn't contained in the article or video. If Apple uses a Gemini-based model anonymously, it would be easy to switch away from it in the future - just as Apple had used both Samsung and TSMC fabs, or how Apple has used both Samsung and Japan Display. Heck, we know that Apple has bought cloud services from AWS and Google, but we don't have "iCloud by AWS and GCP."

      Yes, this is a more public announcement than Apple's display and camera part suppliers, but those aren't really hidden. Apple's dealings with Qualcomm have been extremely public. Apple's use of TSMC is extremely public. To me, this is Apple saying "hey CNBC/investors, we've settled on using Gemini to get next-gen Siri happening so you all can feel safe that we aren't rudderless on next-gen Siri."

      7 replies →

    • I don't see why - iOS originally shipped with Google Maps as standard, for example. Macs shipped with Internet Explorer as standard before Safari existed

      22 replies →

    • Don't think that's an especially big deal, they've always included third party data in Siri or the OS which is usually credited (Example: Maps with Foursquare or TomTom, Flight information from FlightAware, Weather data and many more).

    • They can also put "Google" in the forever-necessary disclaimer

      Google AI can make mistakes

  • Is this another one of those AI deals where no real money changes hands? In this case, doesn't this just offset the fee Google pays Apple for having their search as the default on Apple devices?

    • I'll wager the accounting for the two contracts is separate. There may be stipulations that connect the two, but the payment from Google to Apple of $20B+/yr is a long-established contract (set of contracts, actually) that Apple would not jeopardize for the relatively small Google-to-Apple $1B/yr contract, one still unproven and which may not stand the test of time.

      So, yes, practically speaking, the Apple to Google payment offsets a tiny fraction of the Google to Apple payment, but real money will change hands for each and very likely separately.

OpenAI had it; they had a foot in the door with their Siri integration last year. But they dropped that ball, and many other balls.

  • Yeah, I was really expecting them to just continue the partnership that Apple announced when the iPhone 16/iOS 18 came out, but I suppose it's been pretty much radio silence on both fronts since then. That said, the established stability and good-enough-ness that Google offers with Gemini are probably more than enough reason for Apple to pivot to them as a model supplier instead.

  • Yeah. Super disappointing. I may end up switching to Gemini entirely at this rate.

Models are becoming commodities, and their economy doesn't justify the billions required to train a SOTA model. Apple just recognized that.

  • Models are becoming less like commodities. They're differentiating with strengths and weaknesses. When Chinese labs gain more traction, they will stop releasing their models for free. At that point, everyone who wants SOTA models will have to pay.

    • Having to pay has nothing to do with a good being a commodity. I have to pay for sugar, but there is no big difference between brands that justify any of them commanding a monopoly rent, so, sugar is a commodity. The same is more or less true of LLMs right now and unless someone comes up with a new paradigm beyond the transformers architecture, there is no reason to believe this commodification trend is going to be reversed.

      Most of the differentiation is happening on the application/agent layer. Like Coworker.

      The rest of it, is happening on post-training. Incremental changes.

      We are not talking about EUV lithography here. There are no substantial moats built on years of pure and applied research protected by patents.

      4 replies →

This is one of those announcements that actually just excites me as a consumer. We give our children HomePods as their first device when they turn 8 years old (Apple Watch at 10 years, laptop at 12) and in the 6 years I have been buying them, they have not improved one ounce. My kids would like to listen to podcasts, get information, etc. All stuff that a voice conversation with Chatgpt or Gemini can do today, but Siri isn't just useless-- it's actually quite frustrating!

  • Siri still can't play an Apple Music album when there is a song of the same name.

    Even "Play the album XY" leads to Siri only playing the single song. It's hilariously bad.

    • Or the even more frustrating:

      Me: "Hey Siri, play <well known hit song from a studio album that sold 100m copies"

      Siri: "OK, here's <correct song but a live version nobody ever listens to, or some equally obscure remix>"

      Given that these things are, at their core, probability machines... how? Why?

      1 reply →

  • It’s absolutely insane that you can’t say “Siri, play my audiobook” and it play the last audiobook you listened to. Like, come on.

    • Or when you are driving, someone sends a yes-no question where the answer is no.

      Siri: Would you like to answer?

      Me: Yes

      Siri: ...

      Me: No + more words

      Siri: Ok (shuts off)

  • Not exactly the same, but kinda: my gen 1 Google Home just got Gemini and it finally delivers on the promise of like 10 years ago! Brought new life to the thing beyond playing music, setting timers, and occasionally asking really basic questions

  • It remains to be seen what the existing HomePods will support. There’s been a HomePod hardware update in the pipeline for quite some time, and it appears like they are waiting for the new Siri to be ready.

  • It's not going to help them. For Siri to be really useful, it would need deep system integration, and an external model is not going to provide that. People didn't believe me when I said the same about Apple Intelligence and OpenAI.

  • That's what you get for buying into one ecosystem and sticking with it. All that stuff has been available on Alexa for a decade.

I think this is a good move for Apple. It avoids tying them directly to internalised beliefs in their own AI model, it avoids all the capex around building out an AI engine and associated DC, it reduces risk, and it keeps google in a relationship under contract which google will value, and probably value enough to think hard about stupid legal games regarding Playstore and walled gardens.

Apple plainly doesn't believe in the uplift and impending AGI doom. Nor do they believe there's no value in AI services. They just think for NOW at least they can buy in better than they can own.

But based on Apple's long-term VLSI vision, and on their past behaviour with IPR in any space, they will ultimately take ownership.

  • > they will ultimately take ownership.

    How? People have been saying this since CoreML dropped nine years ago. Apple is no closer to revamping Siri or rivaling CUDA than they were back then.

    • When the cost of deployment drops. And when their own chip designs are profitable, they'll take the capital hit. Until then, as long as the income split for Gemini-backed Siri isn't terrible, they'll keep outsourcing. If they persuade Google to deploy Apple chips into the service, I'd say an in-house solution is within sight.

      Apple Private Relay runs on Cloudflare and Fastly, and I believe one other major provider. They certainly can and do run services with partners for a long time.

      1 reply →

This is actually a smart and common sense move by Apple.

The non-hardware AI industry is currently in an R&D race to establish and maintain marketshare, but with Apple's existing iPhone, iPad and Mac ecosystem they already have a market share they control so they can wait until the AI market stabilizes before investing heavily in their own solutions.

For now, Apple can partner with solid AI providers to provide AI services and benefits to their customers in the short term and then later on they can acquire established AI companies to jumpstart their own AI platform once AI technology reaches more long term consistency and standardization.

> After careful evaluation, we determined that Google’s technology provides the most capable foundation for Apple Foundation Models

Sounds like Apple Foundation Models aren't exactly foundational.

Somewhat surprising. AI is such a core part of the experience. It feels like a mistake to outsource it to arguably your biggest competitor.

  • It's clear they don't have the in-house expertise to do it themselves. They aren't an AI player. So it's not a mistake, just a necessity.

    Maybe someday they'll build their own, the way they eventually replaced Google Maps with Apple Maps. But I think they recognize that that will be years away.

    • Apple has been using ML in their products for years, to the point that they dedicated parts of their custom silicon for it before the LLM craze. They clearly have some in-house ML talent, but I suppose LLM talent may be a different question.

      I’m wondering if this is a way to shift blame for issues. It was mentioned in an interview that what they built internally wasn’t good enough, presumably due to hallucinations… but every AI does that. They know customers have a low tolerance for mistakes, and any issues will quickly become a meme (see the Apple Maps launch). If the technology is inherently flawed and will never live up to their standards, then by outsourcing it they can point to Google as the source of the failings. If things improve down the road and they can pivot away from Google, they’ll look better and Google will look worse. This could be the long game.

      They may also save a fortune in training their own models, if they don’t plan to directly try to monetize the AI, and simply have it as a value add for existing customers. Not to mention staying out of hot water related to stealing art for training data, as a company heavily used by artists.

    • I agree that they don't appear poised to do it themselves. But why not work with Meta or OpenAI (maybe a bit more questionable with MS) or some other player, rather than Google?

      2 replies →

  • > AI is such a core part of the experience

    For who? Regular people are quite famously not clamouring for more AI features in software. A Siri that is not so stupendously dumb would be nice, but I doubt it would even be a consideration for the vast majority of people choosing a phone.

  • They could use it like Google Search, not as the first thing the user sees, but as a fallback

  • Web search is a core part of browsing and Apple is Google's biggest competitor in browsers. Google is paying Apple about 25x for integrating Google Search in Safari as Apple will be paying Google to integrate Google's LLMs into Siri. If you think depending on your competitor is a problem, you should really look into web search where all the real money is today.

Personally, as someone bought into the Apple ecosystem, this is worrying. I am aware of how PCC is supposed to work (which is the likely target platform), but a deal with Google of all companies sends a bad signal to privacy-focussed consumers. If such a feature is baked in without a way to switch it off, my next device will not be an iPhone, MacBook, or iPad.

  • You have been able to disable Siri since the very beginning; why would it suddenly lose that toggle?

What happens to on-device intelligence? Does it stay a massive part of the Apple Intelligence offer? Or can we expect everything to be offloaded to the cloud?

  • Apple seems to be positioning this announcement as "on-device intelligence", with the caveat that Apple promotes its Private Cloud Compute as "on-device", or at least "on-device-like". I'd be curious to see a breakdown of what they expect to do on-device versus in Private Cloud Compute for this Siri project. I'm a little on the fence about whether Private Cloud Compute counts as "on-device", but I'm hopeful it is a good idea and is as well built/considered as its documentation says it is.

My experience with Gemini (3 Flash) has been pretty funny, not awful (but worse than Kimi K2 or GPT 5.2 Mini), but it's just so much worse at (or rather hyper focused on) following my custom instructions, I keep getting responses like:

    The idiomatic "British" way of doing this ...

    Alternatively, for an Imperial-style approach, ...

    As a professional software engineer you really should ...

in response to programming/Linux/etc. questions!

(Because I just have a short blurb about my educational background, career, and geography in there, which with every other model I've tried works great to ensure British spelling, UK information, metric units, and cut the cruft because I know how to mkdir etc.)

It's given me a good laugh a few times, but just about getting old now.

Old news now I think, but good news. Except for my Apple Watch I have given up using Siri, but I use Gemini and think it is good in general, and awesome on my brother's Pixel phone.

Because Apple Silicon is so good for LLM inferencing, I hope they also do a deal for small on-device Gemma models.

I thought it was interesting that a Google flack stressed that the model would run on Apple's compute, and seemed to imply it might even run on-device. Allegedly this was said to allay the (expected) privacy concerns of Apple users who wouldn't want their Siri convos shared with Google.

But I saw something else in that statement. Is there going to be some quantized version of Gemini tailored to run on-device on an M4? If so, that would catapult Apple into an entirely new category merging consumer hardware with frontier models.

  • You can already run quantized models without much friction; people even have dedicated apps for that. It changes very little, because everyone who wanted to do this has already solved it, and those who haven't don't care. It's a marginal gain for consumers, a feature for Apple to brag about, and a big gain for Google. Users would also need to change existing habits, which is undoubtedly hard to do.
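
    For anyone unfamiliar with what quantization buys you, here is a toy sketch of the core idea only: store small integers plus a scale factor instead of float32 weights, trading a little precision for roughly 4x less memory. Real on-device schemes use per-channel scales, 4-bit packing, and more; nothing below reflects how Gemini would actually be quantized.

```python
# Toy symmetric int8 quantization: one scale factor for the whole
# weight list. Real schemes are far more sophisticated.

def quantize(weights):
    """Map floats onto int8 range [-127, 127] with a single scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the int8 values."""
    return [x * scale for x in q]

w = [0.12, -0.5, 0.33, 1.0]
q, s = quantize(w)
approx = dequantize(q, s)
print(q)                               # small ints, 1 byte each instead of 4
print([round(a, 3) for a in approx])   # close to, but not exactly, the originals
```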

They already use GCP for storage so I guess there is some precedent for big ties between them

Really, Siri is an agent. Agents thrive when the underlying model's capabilities are higher, as that unlocks a series of use cases that are hard to accomplish when the basic natural language processing layer is weak.

The better the model layer implements the basic NLP tasks (named entity recognition, part-of-speech tagging, dependency parsing, semantic role labelling, event extraction, constituency parsing, classification/categorization, question answering, etc.), the farther you can go in implementing meaningful use cases in your agent.

Apple can now concentrate on making Siri a really useful and powerful agent.

This is good for Siri, in many ways. But I was kind of hoping we would see a time soon when phone hardware became good enough to do nearly 100% of the Siri-level tasks locally rather than needing Internet access.

  • I suspect we'll see that, but Siri is in such a bad state of disrepair that Apple really needs something now, while they continue to look for micro-scale LLMs that can run well enough locally. The two things aren't mutually exclusive.

    The biggest thing Apple has to do is get a generic pipeline up and running, that can support both cloud and non-cloud models down the road, and integrate with a bunch of local tools for agent-style workloads (e.g. "restart", "audio volume", "take screenshot" as tools that agents via different cloud/local models can call on-device).
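
    A minimal sketch of that pipeline idea, with all tool names and the structured-call format invented purely for illustration (this is not any real Apple or Gemini API): the point is that the tool registry stays fixed while the model backend behind it can be swapped between cloud and local.

```python
# One registry of local capabilities; any model backend (cloud or
# on-device) dispatches into it via the same structured-call shape.

from typing import Callable, Dict

TOOL_REGISTRY: Dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Decorator that registers a local capability under a name."""
    def wrap(fn):
        TOOL_REGISTRY[name] = fn
        return fn
    return wrap

@tool("set_volume")
def set_volume(level: int) -> str:
    return f"volume set to {level}%"

@tool("take_screenshot")
def take_screenshot() -> str:
    return "screenshot saved"

def dispatch(call: dict) -> str:
    """A model would emit calls like
    {"tool": "set_volume", "args": {"level": 40}}; we just route them."""
    return TOOL_REGISTRY[call["tool"]](**call.get("args", {}))

print(dispatch({"tool": "set_volume", "args": {"level": 40}}))
# -> volume set to 40%
```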

  • I don’t think there’s a clear boundary of “Siri-level” tasks. In particular, properly determining whether a task is “Siri-level” or not is likely to require off-device AI.

    • I'd hope it could be the other way around. Some stuff should be relatively straightforward -- summarizing notifications, emails, setting timers, things like that should be obviously on-device. But aside from that, I would hope that the on-device AI can make the determination on whether it is necessary to go to a datacenter AI for a better answer.

      But you may be right, maybe on-device won't be smart enough to decide it isn't smart enough. Though it does seem like the local LLMs have gotten awfully good.

      1 reply →

The actual transactions around this deal will be interesting: will Google simply withhold $1B from their search deal, or will they pay it and then Apple pays it back (or some split)? I doubt we’ll ever know.

Can someone explain to me how this was allowed to happen? Wasn't Siri supposed to be the leading AI agent not ten years ago? How was there such a large disconnect at Apple between what Siri could do and what "real" AI was soon to be capable of?

Was this just a massive oversight at Apple? Were there not AI researchers at Apple sounding the alarm that they were way off with their technology and its capabilities? Wouldn't there be talk within the industry that this form of AI assistant would soon be looked at as useless?

Am I missing something?

  • Source: while I don’t have any experience with the inner workings of Siri, I have extensive experience with voice based automation with call centers (Amazon Connect) and Amazon Lex (the AWS version of Alexa).

    Siri was never an “AI agent”. With intent-based systems, you give the system phrases to match on (intents), and to fulfill an intent, all of the “slots” have to be filled. For instance, “I want to go from $source to $destination”, and then the system calls an API.

    There is no AI understanding - it’s a “1000 monkeys implementation”: you just give the system a bunch of variations and templates you want to match on, in every single language you care about, and map the intents to an API. That’s how Google and Alexa also worked pre-LLM; they just had more monkeys dedicated to creating matching sentences.

    Post-LLM, you tell the LLM what the underlying system is capable of and the parameters the API requires to fulfill an action, and the LLM can figure out the user’s intentions and ask follow-up questions until it has enough info to call the API. You can specify the prompt in English, and it works in all of the languages the LLM has been trained on.

    Yes I’ve done both approaches
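
    To make the contrast concrete, here is a toy sketch. Every intent name, template, and tool schema below is invented for illustration; it is not the Lex, Siri, or Gemini API.

```python
# Pre-LLM vs post-LLM, in miniature.

import re

# --- Pre-LLM: hand-written intent templates with slots ----------------
# Each intent is a phrase pattern; unmatched phrasings simply fail.
INTENTS = {
    "book_trip": re.compile(
        r"i want to go from (?P<source>\w+) to (?P<destination>\w+)", re.I
    ),
}

def match_intent(utterance: str):
    """Return (intent_name, slots) if a template matches, else None."""
    for name, pattern in INTENTS.items():
        m = pattern.search(utterance)
        if m:
            return name, m.groupdict()
    return None

# --- Post-LLM: describe the API once; the model fills the slots -------
# In a real system this schema is sent to the model, which returns a
# structured call like {"name": "book_trip", "arguments": {...}} for
# any phrasing in any language, and asks follow-ups for missing slots.
TOOLS = [{
    "name": "book_trip",
    "description": "Book travel between two cities.",
    "parameters": {"source": "string", "destination": "string"},
}]

print(match_intent("I want to go from Paris to Berlin"))
# -> ('book_trip', {'source': 'Paris', 'destination': 'Berlin'})
print(match_intent("get me to Berlin from Paris"))
# -> None  (the template misses this phrasing; an LLM would not)
```

    The second call failing is the whole "1000 monkeys" problem in one line: every phrasing you didn't anticipate is a miss.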

    • I appreciate the response, but that doesn't really answer my question.

      I want to know why the executive leadership at Apple failed to see LLMs as the future of AI. ChatGPT and Gemini are what Siri should be at this point. Siri was one of the leading voice-automated assistants of the past decade, and now Apple's only options are to strap on an existing solution to the name of their product or let it go defunct. So now Siri is just an added layer to access Gemini? Perhaps with a few hard-coded solutions to automate specific tasks on the iPhone, and that's their killer app into the world of AI? That's pathetic.

      Is Apple already such a bloated corporation that it can no longer innovate fast enough to keep up with modern trends? It seems like only a few years ago they were super lean and able to innovate better than any major tech company around. LLMs were being researched in 2017. I guess three years was too short of a window to change the direction of Siri. They should have seen the writing on the wall here.

      6 replies →

To me, Apple's ML business was all about federated learning. I know this concept is from the pre-transformer era, but I conjecture that one of the reasons Apple didn't adopt LLMs right away was that it couldn't find a reasonable way to do federated learning with them. I wonder whether Apple will give up on this idea, and I would like to see how it could be done with current AI systems.

Makes sense given the search alliance already in place.

Amazon/AWS was trying hard to push its partnership with Apple once that was revealed, including vague references to doing AI things, but AWS is just way too far behind at this point, so it looks like they lost out here to Google/GCP.

They've let a snake in their walled garden.

I didn't realize that Apple could possibly be more stupid in their strategy with AI, but now they've given the game to their biggest competitor in every arena in which they compete.

It's truly amazing how badly they've flubbed it.

What I want to know is the privacy impact of this partnership. I see terms like "Apple will be running Google's models on their infrastructure" but that definitely is not enough detail for me to know where my data is going.

Any details on privacy and data sharing surfaced yet?

Google release hints at this being more than just Siri:

> Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year.

... https://blog.google/company-news/inside-google/company-annou...

  • This is actually the most important part of this announcement, and excellent news. I was pretty disappointed that they were going with an existing player rather than building their own models. But this implies that they will continue to build their own base models, just using Gemini as a starting point, which is a pretty good solution.

This is the sad state of Apple right now. It is ridiculous that they basically had unlimited access to TSMC and achieved nothing in AI. Management is a joke.

  • During his tenure as CEO, Tim Cook has added $700 million per day in enterprise value to Apple. Per day! For 14 years!

      I was not aware HN is now an investment discussion board. Even if you were to argue that point, what’s his incremental value compared to the but-for world, i.e. one where Steve Jobs is still alive and running Apple? I am sure Jobs would’ve sat on his behind milking iPhones and just let Google, Microsoft, Meta and Nvidia take the entire AI TAM. I am sure that’s the Steve Jobs we all knew.

      1 reply →

Feels like this is a huge whiff. At best your implementation is as good as what you can get from Google's own offerings on Pixel; most likely Apple's offering will always be a bit behind. I'm probably a bit biased, as I've preferred Anthropic, but those two companies also seem more aligned on other outward policies like privacy.

Does anyone know what Apple's "Private Cloud Compute" servers actually are? I recall murmurings about racked M chips or some custom datacenter-only variant?

I'm really curious how Apple is bridging the gap between consumer silicon and the datacenter scale stack they must have to run a customized Gemini model for millions of users.

RDMA over Thunderbolt is cool for small lab clusters but they must be using something else in the datacenter, right?

  • Apple is one of the largest customers of Google Cloud Platform.

    Last I heard most of their e2e storage for iCloud was on GCP.

    They also use AWS.

I don't understand why Apple cannot implement their own LLM at the phone level for the easy pickings: settings control, app-specific shortcuts, local data search.

I understand other things like image recognition, wikipedia information, etc require external data sets, and transferring over local data to that end can be a privacy breach. But the local stuff should be easy, at least in one or two languages.

  • All signs are that they are doing exactly that. They already have an on-device LLM which powers certain features, and I expect they will have a better-trained version of that on-device model that comes out with the "new Siri" update.

    In the original announcement of the Siri revamp a couple of years ago, they specifically talked about having the on-device model handle everything it can, and only using the cloud models for the harder or more open-ended questions.

I wonder if we will see them take the final step and just make Gemini the default AI assistant on iPhone.

Might sound crazy but remember they did exactly this for web search. And Maps as well for many years.

This way they go from building and maintaining Siri (which has negative brand value at this point) and paying Google's huge inference bills, to actually charging Google for the privilege.

This morning I was wondering what happened to whatever arrangement I thought Apple had with OpenAI. In a way I think OpenAI is a competitor and “new money”. Pairing with Google makes sense especially considering that this is “normie-facing” technology. And from what I recall, a lot of Apple fans prefer “Hey Google” in their cars over CarPlay. Or something to that effect.

I'm a long-time Android user and almost switched to iPhone last year, mostly because I use macOS and wanted better integration, and I also wanted to try it. Another big factor was the AI assistant. I stayed with Android because I think Google will win here. Apple will probably avoid losing users to its biggest competitor by reaching rough parity using the same models.

I think Apple is also more comfortable with Google than say an OpenAI due to the past relationship with the search deal.

So what has Apple done for AI in the last few years? Anything? Such a huge company

I’m hopeful this means Apple will finally ship an updated (i.e. actually useful) Siri this calendar year. I have to assume they’ve been integrating Gemini into their OSes for some time now and that this is just the announcement that makes it official.

So this surely means that in the medium term Google will siphon off all of the iCloud data. A dark pattern here, a new EULA popup for the user to accept there, and just like with copilot on windows the users will "allow" Apple to share all data with Google.

  • I wouldn't expect this to happen, as Apple's resistance against this would be too strong. The data of Google's paying [enterprise] customers stays private as well, so the safeguards are in place already.

"Google already pays Apple billions each year to be the default search engine on iPhones. But that lucrative partnership briefly came into question after Google was found to hold an illegal internet search monopoly.

In September, a judge ruled against a worst-case scenario outcome that could have forced Google to divest its Chrome browser business.

The decision also allowed Google to continue to make deals such as the one with Apple."

How much is Google paying Apple now?

If these anti-competitive agreements^1 were public,^2 headlines could be something like,

(A) "Apple agrees to use Google's Gemini for AI-powered Siri for $[payment amount]"

Instead, headlines are something like,

(B) "Apple picks Google's Gemini to run AI-powered Siri"

1. In other words, they are exclusive and have anticompetitive effects

2. Neither CNBC nor I are suggesting that there is any requirement for the parties to make these agreements public. I am presenting a hypothetical relating to headlines, (A) versus (B), as indicated by the words "If" and "could"

Why is Apple's self-developed AI progressing so slowly that they still need to collaborate with Google? What's going on with their AI team? In this era of AI, it seems like Apple has already fallen behind.

  • They let others run that costly costly costly race. It's not where Apple is most competitive.

Apple is buying time for another shot; right now they are in over their heads on AI and they need a fresh start. Eventually they will own the tech from the bottom up, that's the goal at least.

That's a shame. I was hoping for an Apple AI independent of advertiser influence.

Oh, well. What could have been great.

Is the era of Apple exceptionalism over? Has it been over for a while now?

Why are they constantly so bad at AI but so good at everything else?

  • Because their focus on user privacy makes it difficult for them to train at scale on users' data in the way that their competitors can. Ironically, this focus on privacy initially stemmed from fumbling the ball on Siri: recall that Apple never made privacy a core selling point until it was clear that Siri was years behind Google's equivalent, which Apple then retroactively tried to justify by claiming "we keep your data private so we can't train on it the way Google can." The result was a vicious cycle: initially botch AI rollout -> justify that failure with a novel marketing strategy around privacy that only makes it harder to improve their AI capabilities -> botch subsequent AI rollouts as a result -> ...

    To be clear, I'd much rather have my personal cloud data private than have good AI integration on my devices. But strictly from an AI-centric perspective, Apple painted themselves into a corner.

    • That's a poor justification. There are companies that will sell you all kinds of labelled data. OpenAI and Anthropic didn't train on their own user data.

    • Apple's privacy focus started long before the current AI wave. It got major public attention in the fight with the FBI over unlocking the San Bernardino shooter's phone. I don't think Google's equivalent even existed at that point.

    • This is nonsense. You don't need Apple user data to build a good AI model, plenty of startups building base models have shown that. But even if you did it's nonsense as Apple has long had opt-in for providing data to train their machine learning models, and many of those models, like OCR or voice recognition, are excellent.

  • It's been a long-running thing that Apple can't do software as well as competitors, though in my experience they've beaten Google and a few others at devex and UX in their mobile frameworks over time, despite initial roughness. Slow and steady might win this race eventually, too.

  • It's pretty Apple-ish to not jump into a frenzy and to wait for the turbulence to settle, I believe. Delegating to Gemini fits that theory?

    • They've tried to have an AI assistant before AI was a big thing...it's just pretty bad and Siri never got better.

      If it would suddenly get better, like they teased (some would say lied about) with Apple Intelligence, that would fit pretty well. That they now delegate it to Gemini is a defeat.

  • They aren’t so good at everything else either.

    • I would not lure so many comments if I didn't say this. Let's fish an answer from that pool of Apple fanboys.

  • See the ML research papers from Apple. Their researchers preferred small models over large LLMs, thinking researchers' effort would make up for the lack of compute. Then the scaling law hit them hard.

  • Apple is almost purely customer products, they don't have the resources to compete with the giants in this field.

    Their image classification happens on-device; in comparison, Google Photos does it server-side, so Google already has the ML infra.

  • I think that's the thing: Apple is good at very little, but they seem like they're good at "everything else" because they don't do much else. Lots of companies spread themselves really thin trying to get into lots of unrelated competencies and tons of products. Apple doesn't.

    Why does a MacBook seem better than PC laptops? Because Apple makes so few designs. When you make so few things, you can spend more time refining the design. When you're churning out a dozen designs a year, can you optimize the fan as well for each one? You hit a certain point where you say "eh, good enough." Apple's aluminum unibody MacBook Pro was largely the same design 2008-2021. They certainly iterated on it, but it wasn't "look at my flashy new case" every year. PC laptop makers come out with new designs with new materials so frequently.

    With iPhones, Apple often keeps a design for 3 years. It looks like Samsung has churned out over 25 phone models over the past year while Apple has 5 (iPhone, iPhone Plus, iPhone Pro, iPhone Pro Max, iPhone 16e).

    It's easy to look so good at things when you do fewer things. I think this is one of Apple's great strengths - knowing where to concentrate its effort.

    • This is some magical thinking. Even if Samsung took all their manpower, all their thought process and all their capital, they still couldn’t produce a laptop that competes with the MacBook (just to take one example), because they fundamentally don’t have any taste as a company.

      Hell, they can't even make a TV this year that's less shit than last year's version, and all that requires is doing literally nothing.

      1 reply →

  • There's no reason to think that Apple would have any more skill at making a frontier AI model than at making airplanes or growing soybeans. There's not much overlap between consumer electronics design and the expertise, data, training, and datacenters needed for AI.

    • I feel like this ignores how big of a part the software is for those "consumer electronics" Apple is so good at making.

      Apple definitely has software expertise, maybe it's not as specialized into AI as it is about optimizing video or music editors, but to suggest they'd be at the same starting point as an agriculture endeavor feels dishonest.

I guess this is just a continuation of the Search deal, and an admission that LLMs are replacing search.

I can't wait for Gemini to lecture me on why I should throw away my Android

I wonder if this will make my original HomePods interesting to talk to, or if they won't provide this on older devices.

  • Not sure if that's too much of a crutch for you, but it's quite easy to create an "Ask Gemini" shortcut that calls a Cloud Function and returns a spoken response. I use this on my HomePods all the time, and it's working great.
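For the curious, the server side of such a shortcut can be tiny. Below is a minimal sketch, assuming the public Gemini `generateContent` REST endpoint; the model name, URL, and helper names are illustrative, not from the comment above, so check Google's current docs before deploying:

```python
import json
import urllib.request

# Assumed endpoint/model name -- verify against Google's current Gemini API docs.
GEMINI_URL = ("https://generativelanguage.googleapis.com/v1beta/"
              "models/gemini-1.5-flash:generateContent")


def build_request(prompt: str) -> dict:
    """Shape the JSON body the generateContent endpoint expects."""
    return {"contents": [{"parts": [{"text": prompt}]}]}


def extract_text(response_json: dict) -> str:
    """Pull the first candidate's text out of a generateContent response."""
    return response_json["candidates"][0]["content"]["parts"][0]["text"]


def ask_gemini(prompt: str, api_key: str) -> str:
    """Forward the Shortcut's spoken query to Gemini and return plain text."""
    req = urllib.request.Request(
        f"{GEMINI_URL}?key={api_key}",
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_text(json.load(resp))
```

On the phone side, a Shortcut with a "Get Contents of URL" action pointed at the deployed function, followed by "Speak Text" on the result, is enough; Siri on a HomePod can then trigger it by name.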

Guess I am not using Siri anymore…

By the way, have any of you ever tried to delete and disable Siri's iCloud backup? You can't do it.

  • Why not? Apple's ChatGPT integration has been pretty explicitly anonymizing requests and doesn't require an account. Maybe I'm missing something.

  • The article clearly mentions that the Gemini model will be used for the foundation model running on device or on Apple's own servers. They are not sending Siri requests to Google's servers.

  • You guys use Siri?

    • My exact reaction every time I hear people discuss Siri. I don’t think I used it once in my life and it’s one of the first thing I turn off every time I have a new device. So interesting to see how different people use the same devices in completely different ways.

      4 replies →

    • For CarPlay, yes. I don't need a virtual assistant to do things I can do but worse; I need reliable voice controls to send messages, start phone calls, change the map destination and such with as little friction as possible.

      Siri needs faster and more flexible handling of Spotify, Google Maps and third-party messaging apps, not a slop generator.

    • Only for opening/closing the garage door, setting timers, and sending texts. What else do people use the digital assistants for?

    • Hundreds of times a day for HomeKit, though rarely anything else. It’s _mostly_ fine, provided there are no HomePods around.

    • Only when I wake up in the middle of the night to ask it what is the current time of the dystopia. That and the calculator.

  • Unless Apple is lying:

    On iPhone, Settings → iCloud → Storage → Siri → Disable and Delete

    Edit: Tried it. It works for me. Takes a minute though.

    • I have a current case open with Apple about this issue. It does not work, and I'm sorry, but I just don't believe you, because Apple says there is a technical problem preventing this that does not just affect me. I also tried it on three other friends' phones and it does not work.

  • You're using Siri? lmao

    That's the Internet Explorer of chatbots.

    • That's where people get confused: it's not a chatbot or an LLM, it's a voice command interface. Adding something to the shopping list, setting a timer, turning up the heating in the back room, playing some music, skipping a track, sending a message: it works perfectly well for all of those, and that's what I use it for virtually every day.

      This work is to turn it into something else, more like a chatbot, presumably

      1 reply →

    • Jeez, I only use it for the time and the calculator, and to ask it to call someone. I am shocked anyone uses it for anything more than that.

      Also, I have never turned on Apple "Intelligence".

Given that Gemini 3 Pro is presumably a relatively small model, it wouldn't be too surprising to see an even more optimized model fit into the latest iPhones. I wish we knew the details behind Gemini 3 Flash, because if my estimate that it's <50B parameters is true, holy shit.

  • Google has Gemini Nano for on-device capabilities but basically never uses it and defers to cloud models instead

It tells you how bad their product management and engineering teams are that they haven't just decided to kill Siri and start from scratch. Siri has been utterly awful, and that's an understatement, for at least half a decade.

I always said that Apple does not have the cloud GPU power to deliver a good AI experience to their huge customer base, no matter how good their developers are. They have to find someone with a lot of GPUs that can handle the workload, and that's Google.

This seems like a pretty significant anti-trust issue. One of the two mobile OS makers is using a product from the other for its AI assistance. And that means that basically all mobile devices will be using the same AI technology.

I don't expect the current US government to do anything about it though.

  • What antitrust rule do you think would be breached?

    I admit I don't see the issue here. Companies are free to select their service providers, and free to dominate a market (as long as they don't abuse such dominant position).

    • Gatekeeping - nobody else can be the default voice assistant or power Siri, so where does this leave eg OpenAI? The reason this is important is their DOJ antitrust case, about to start trial, has made this kind of conduct a cornerstone of their allegations that Apple is a monopoly.

      It also lends credence to the DOJ's allegation that Apple is insulated from competition - the result of failing to produce their own winning AI service is an exclusive deal to use Google while all competing services are disadvantaged, which is probably not the outcome a healthy and competitive playing field would produce.

      9 replies →

    • Apple and Google have a duopoly on Mobile OS. If Apple uses Google's model for Siri, that means Apple and Google are using their duopoly in one market (mobile OS) to enforce a monopoly for Google in another (model for mobile personal assistant AI).

      1 reply →

Weird thing is, Gemini hasn't cracked speech-to-speech yet. They have a product, but both Anthropic and Google are visibly behind OpenAI's voice mode in speech-to-speech.

How has Apple made some of the greatest phones in history, with amazing engineering and a lot more, yet can't make a simple model to run locally on the phone when many others did?

I think it's good. Google has a record of being stable and of working with large partners (governments etc.), and it avoids the controversial cult of Altman.

I used Gemini heavily in 2025. It was very bad and frustrating to use. Shame on Google for shipping it in the condition they did. No quality control. No pride in craft.

Google's strategy is as unreadable as ever. It feels like two companies fighting each other in one.

On the one hand, they apparently want to be a service provider Microsoft-style. They are just signing a partnership with their biggest competitor and giving them access to their main competitive advantage, the most advanced AI available.

On the other hand, they want to be another Apple. They are locking down their phone, competing with the manufacturers of the best Android phones, and limiting the ability to distribute software on their system: things that were their main differentiator.

It doesn't make sense. It's also a giant middle finger to the people who bought a Pixel for Gemini. Congrats, you were beta testers for iPhone users, who won't even have to share their data with Google for training Gemini. I have rarely seen a company as disrespectful to its customers.

I hope Apple will make it a smooth integration. The ChatGPT integration in Apple Intelligence is not smooth: it constantly wants user input („confirm you are 13+“ at every request).

It's hilarious how Apple can't compete in the space and so many people here are just saying "Smart move by Apple" as if they had another choice at this point. It's not like they haven't tried.

  • If they wanted to, they could throw massive amounts of cash at it like Google and Facebook are, with the latter poaching Apple employees with $200 million pay packages: https://www.bloomberg.com/news/articles/2025-07-09/meta-poac...

    But why on earth would they do that? It's both cheaper and safer to buy Google's model, with whom they already have a longstanding relationship. Examples include the search engine deal, and using Google Cloud infrastructure for iCloud and other services. Their new "private cloud compute" already runs on GCP too, perfect! Buying Gemini just makes sense, for now. Wait a few years until the technology becomes more mature/stable and then replace it with their own for a reasonable price.

    • Why did they even have Ruoming Pang on staff? Because they were trying. Failing, and then saying we're waiting is objectively hilarious.

    • No, they couldn't, because all the current and future training hardware is already tied up by contracts from the frontier labs. Apple could not simply buy its way in given how constrained the supply is.

      1 reply →

Will Apple and Google merge now? That would create a new #1, bigger than Nvidia.

It would take US antitrust approval, but under Trump, that's for sale.

In other news Yahoo! picks Bing to become its new search engine... This is that stupid.

But, but, didn't they say two years ago that they picked OpenAI to power Apple Intelligence? Where did that go? Wasn't Scam Altman gay enough for Tim Apple?

Why couldn't Apple pull their finger out of their asses and make their own AI nonsense better than Crap GPT?

Steve Jobs rolling in his grave. The mortal enemy. Thermonuclear war.

  • Enemies? Google contributes about 20% of Apple's profits annually through their default search engine deal; that's more profitable than just about anything else Apple does or makes except selling iPhones.

    > The U.S. government said Apple Chief Executive Officer Tim Cook and Google CEO Sundar Pichai met in 2018 to discuss the deal. After that, an unidentified senior Apple employee wrote to a Google counterpart that “our vision is that we work as if we are one company.”

    https://www.bloomberg.com/news/articles/2020-10-20/apple-goo...

  • The original iPhone came pre-loaded with Google search, Maps, and YouTube. Jobs competed with Google, but he also knew Google had best-in-class products too.