Comment by Gigachad
14 hours ago
These are software/cloud features. You can install Gemini on an iPhone if you want to talk about towers in Chicago.
The only reason to care about it being OS-integrated is to interact with functions of the OS, which Siri does fine.
Apple's AI stuff also uses cloud features, though you can't use them on other platforms. The problem with Apple's new cloud features is that they generally just suck. I'm surprised iCloud works so well given how hard they're fumbling basic stuff like this.
At least all of the ones I have tried work locally. I've entered airplane mode and things like the magic eraser in images still work fine.
Siri does not do it fine; it's literally the example the commenter above showed.
Knowing the building heights around Chicago is not an OS feature. Even if Siri were perfect, they still aren't going to ship a Wikipedia object graph on every phone.
Likewise, the phone does not understand removing people from a photo. That is a feature specific to the photo app, and Siri has let you wire in voice commands for your app's features just fine for years. If Google decided for competitive reasons not to ship this feature to non-Pixel or non-Android users, that's not a Siri fault. That Apple did not integrate this as a voice command into their Photos app is also not a Siri fault (is it really common to remove all people from a photo, vs. specific people?)
> Hey Siri start the Chronometer / There is no contact named Chronometer in your phone
Is what I was referring to. Siri often fails at even opening apps, which is an OS feature. Regardless, even for your examples, at a certain point an AI assistant not being able to do things that others can does become the fault of that assistant.