Comment by treprinum

3 years ago

Not a single AR functionality was demoed; all that was shown was a 2D overlay on top of the real-world view. No 3D computer vision segmenting objects on screen, no placing 3D objects on real objects, etc. It felt less capable than HoloLens did a few years back.

All the more stunning because iOS already has quite capable AR features. It was such a freakish omission.

At my most charitable: maybe they didn't want to take the risk. They don't yet know what they would do with those capabilities. As a platform provider, AR isn't their thing; they need developers to go build that stuff. By showing nothing, they emphasize only the things they have power over.

In short, the omission keeps the narrative from getting ahead of the product.

Yes. The other standout omission was shared-space AR. Meta is big on this with its shared anchors.

They specifically spell out how they don't want it to be isolating, yet every shot of someone using it to watch movies or sports shows them entirely alone in the experience. It feels like something went seriously wrong with some piece of this and they yanked a large segment of the AR functionality out. I don't believe they didn't try. I wonder what the story is.

The focus on the HoloLens-style air tap read like an expectation adjustment to me: that it's more or less AppleLens.

What’s the use case for placing 3D objects on real objects?

  • For example: tutorials on how to fix or replace things, showing how things work internally, playing chess, demo surgery, etc.