Comment by ricardobeat

7 days ago

Unlikely. Frosted glass blur was introduced almost twelve years ago in iOS 7, and was supported all the way down to the iPhone 4. Many apps and system surfaces like Control Center have used a full-screen blur without any performance issues for a long time.

Apple at the time created their own 'approximate Gaussian blur' algorithm specifically to enable this, and it ran crazy fast on devices where a simple Gaussian blur would barely achieve double-digit FPS. Even if this 'liquid glass' effect is heavier to compute, on today's hardware it will be a negligible performance concern.
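
Apple hasn't documented the exact algorithm, but the classic trick for a fast approximate Gaussian is to run a cheap box blur several times: three passes already look very close to a true Gaussian, at a fraction of the cost of evaluating a wide Gaussian kernel per pixel. A minimal grayscale sketch of that generic technique in Swift (my assumption of the approach, not Apple's documented implementation):

    // Approximate a Gaussian blur with repeated box blurs: each pass is a
    // simple running average, and three passes converge on a Gaussian-shaped
    // kernel (central limit theorem). Grayscale [[Float]] keeps this self-contained.
    func boxBlur(_ image: [[Float]], radius: Int, horizontal: Bool) -> [[Float]] {
        let height = image.count, width = image[0].count
        var out = image
        for y in 0..<height {
            for x in 0..<width {
                var sum: Float = 0, count: Float = 0
                for d in -radius...radius {
                    let nx = horizontal ? x + d : x
                    let ny = horizontal ? y : y + d
                    if nx >= 0 && nx < width && ny >= 0 && ny < height {
                        sum += image[ny][nx]
                        count += 1
                    }
                }
                out[y][x] = sum / count
            }
        }
        return out
    }

    // Three horizontal + vertical box passes approximate one Gaussian blur,
    // without ever touching a large per-pixel convolution kernel.
    func approximateGaussianBlur(_ image: [[Float]], radius: Int, passes: Int = 3) -> [[Float]] {
        var result = image
        for _ in 0..<passes {
            result = boxBlur(boxBlur(result, radius: radius, horizontal: true),
                             radius: radius, horizontal: false)
        }
        return result
    }

Real implementations also typically downsample the backdrop first and blur at a fraction of the screen resolution, which is a large part of why this class of blur was viable on 2013-era hardware.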

> Unlikely. Frosted glass blur was introduced almost twelve years ago in iOS 7, and was supported all the way down to the iPhone 4. Many apps and system surfaces like Control Center have used a full-screen blur without any performance issues for a long time.

"Without any performance issues"? Entirely false - reviews at the time noted iOS 7 dramatically reduced battery life - all across the board for Apple devices, even for the then latest iPhone 5S and 5c (https://arstechnica.com/gadgets/2013/09/ios-7-thoroughly-rev...).

The abuse of transparency/translucency in the UI was the primary reason - you could go to Accessibility settings, disable animations and transparency/translucency, and get notable increases in both the responsiveness of the OS UI and battery life.

  • Indeed, I remember the switch to iOS 7; for me, battery life seemed to get slightly worse, but there were conflicting opinions at the time. It's fresh in my memory as it was around the same time I binged all five seasons of Breaking Bad :)

    It's also true that iOS 7 made the 4/4S seem much slower, but the frosted glass effect still ran at 60 FPS - that was my point. It was really impressive at the time. Though unless you spent hours sliding Control Center up and down, it's hard to blame the blur effect for the reduced battery life, as it rarely appeared inside apps. More likely it was the result of increased OS bloat and the proliferation of background services.

  • You can’t judge battery life and performance off a .0 release when the priority is on delivering features with the minimum number of showstopper bugs. At least wait until the .1.

    It has been like this for every Apple release for over 20 years.

    • Maybe for "Apple", but there's one team that takes performance seriously. The WebKit team has a zero tolerance policy for performance regressions (https://webkit.org/performance/) dating back to the implementation of the Page Load Test in 2002 (Creative Selection, p. 93).

      WebKit sounds like the kind of scrappy startup Apple might want to acquire and gain some hard-earned engineering knowledge.

    • If Apple has been shipping betas for 2 decades that do not meaningfully prepare the release candidate for users, something is horribly wrong. They're either not listening to the feedback they receive or they're not giving themselves enough time; both are firmly within Apple's control.

    • > number of showstopper bugs

      Screwing with the battery life on a mobile device would be a showstopper bug if Steve were still around.

This isn't just a Gaussian blur though; there's raytracing and refraction happening. The OS is becoming a low-key high-fidelity video game.

  • I don't usually say things are bloated, but raytracing buttons is something I'd expect to be a parody...

    And all of this just to make the whole UI white and generic.

    I just want everything to look like Windows XP. I don't get it.

    • It’s almost certain to be a fairly cheap thing, at least for a GPU that can sling pixels at the gigabytes per second necessary to get smooth touch scrolling at these screen resolutions.

      The demos only show a very limited array of shapes. Precompute the refraction, store the result in a texture, and the gist should be sample(blur(background), sample(refraction, point)); a rough sketch is at the end of this comment. Probably a bit more complicated than that (I'm no magician of the kind needed to devise cheap graphics tricks like this), but the computational effort should be in that ballpark. Compared to on-device language models and such, I wouldn't be worried.

      (Also, do I need to remind you of the absolute disdain directed by 95/98/Me/2000 users at the “toy” default theme of XP? And it was a bit silly, to be honest. It’s just that major software outfits don’t dare to be silly anymore, and that way lies blandness.)
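
      A minimal sketch of that gist in Swift, assuming the refraction has been baked into a per-pixel offset map; the names and the grayscale backdrop below are illustrative stand-ins, not Apple's API:

          // Hypothetical: each glass pixel carries a precomputed offset saying
          // where to look in the already-blurred backdrop, so compositing
          // reduces to two lookups per pixel.
          struct RefractionMap {
              let offsets: [[(dx: Int, dy: Int)]]
          }

          func compositeGlass(at origin: (x: Int, y: Int),
                              refraction: RefractionMap,
                              overBlurred backdrop: [[Float]]) -> [[Float]] {
              let h = refraction.offsets.count, w = refraction.offsets[0].count
              var out = Array(repeating: Array(repeating: Float(0), count: w), count: h)
              for y in 0..<h {
                  for x in 0..<w {
                      // The gist: sample(blur(background), sample(refraction, point)).
                      let o = refraction.offsets[y][x]
                      // Clamp so refraction near the rim can't read off-screen.
                      let sy = min(max(origin.y + y + o.dy, 0), backdrop.count - 1)
                      let sx = min(max(origin.x + x + o.dx, 0), backdrop[0].count - 1)
                      out[y][x] = backdrop[sy][sx]
                  }
              }
              return out
          }

      On the GPU this amounts to a fragment shader doing two texture reads per pixel; the expensive part is baking the offsets, which only needs to happen once per shape.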

    • > And all of this just to make the whole UI white and generic.

      3:30–3:45 in the video is painful. Describing “giving you an entirely new way, to personalise your experience”, while showing… white. White white white. Oh, and light tinted backgrounds to set your white on. I hope the personalisation you wanted was white.

    • Make things slow so they can sell more hardware to make it look faster?

      I don’t know, just kidding :-)

      If GPUs can handle it, I guess why not. Some people will probably notice and say “wow, looks pretty, glad I upgraded”.

  • From what I've seen, the refractions happen in predictable contexts, so I suspect they'll be able to create shaders, etc., that will limit the performance hit.

  • I would imagine that for a known geometry of glass, you can do the ray tracing once, see where each photon ends up, and then bake that transformation into the UI; a rough sketch of the idea is below. If you do this for each edge and curve your UI will produce, you can stitch them together piecewise to form UI elements of different shapes without computing everything again from scratch.
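
    A sketch of that bake-once idea in Swift; every name, and the toy "trace" that just bends rays near the rounded edges, is a hypothetical stand-in rather than anything Apple has described:

        // Ray trace (or otherwise precompute) the per-pixel refraction offsets
        // for a glass shape the first time it is seen, then reuse the result
        // for every frame and every element that shares the shape.
        struct GlassShapeKey: Hashable {
            let width: Int, height: Int, cornerRadius: Int
        }

        final class RefractionCache {
            private var baked: [GlassShapeKey: [[(dx: Int, dy: Int)]]] = [:]

            func offsets(for key: GlassShapeKey) -> [[(dx: Int, dy: Int)]] {
                if let cached = baked[key] { return cached }
                let traced = traceRefraction(for: key)  // expensive, but runs once per shape
                baked[key] = traced
                return traced
            }

            // Placeholder for the real one-off tracing pass: here, rays are
            // simply bent toward the centre, more strongly near the rounded rim.
            private func traceRefraction(for key: GlassShapeKey) -> [[(dx: Int, dy: Int)]] {
                var map = Array(repeating: Array(repeating: (dx: 0, dy: 0), count: key.width),
                                count: key.height)
                for y in 0..<key.height {
                    for x in 0..<key.width {
                        let edge = min(min(x, key.width - 1 - x), min(y, key.height - 1 - y))
                        if edge < key.cornerRadius {
                            let strength = (key.cornerRadius - edge) / 4
                            map[y][x] = (dx: x < key.width / 2 ? strength : -strength,
                                         dy: y < key.height / 2 ? strength : -strength)
                        }
                    }
                }
                return map
            }
        }

    A real version could go further and bake only a canonical corner plus straight-edge strips, then stitch those together piecewise (nine-slice style) so arbitrarily sized elements never need a fresh trace.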

  • It looks like old-school 2D bump mapping to me; it's not expensive if you don't overengineer it.

  • Where do you see raytracing? It's just reading back the texture of the layer behind, a bit distorted. Honestly, that's cheaper than a blur.

Early iPhone hardware could barely keep up with rendering the UI even with a near-total ban on transparency. Even the iPhone 4, which improved the hardware a lot, had the problem that its Retina display also quadrupled the number of pixels to push around.

And yes, running later iOS versions on early hardware was a huge PITA, full of slowdowns.

Yes! And it was frustratingly patented! https://news.ycombinator.com/item?id=34937618

I suspect that their new technique builds on the existing fast Gaussian blur, and since the patent is about to expire, it was a good time to spice it up.

I suspect, as others have mentioned here, that they use a "Liquid Glass" shader which samples the backing layer of the UI composition below the target element and applies a lens distortion based on the target element's border radius, all heavily parameterized so it can be used across the rest of the system's Liquid Glass surfaces, like the new icon system.
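
Guessing at what the per-pixel core of such a shader might look like, written in plain Swift rather than Metal; the signed-distance trick, the falloff curve, and every parameter name here are assumptions, not anything Apple has published:

    // Hypothetical per-pixel lens distortion driven by the element's corner
    // radius: measure how close a pixel is to the rounded-rect edge, and bend
    // the sample point toward the centre with strength that ramps up at the rim.
    struct LiquidGlassParams {
        var cornerRadius: Double
        var rimWidth: Double         // how far in from the edge the distortion reaches
        var maxDisplacement: Double  // strongest bend, right at the rim
    }

    // Signed distance from a point to the edge of a rounded rectangle centred at the origin.
    func roundedRectDistance(point: (x: Double, y: Double),
                             halfSize: (x: Double, y: Double),
                             radius: Double) -> Double {
        let qx = abs(point.x) - (halfSize.x - radius)
        let qy = abs(point.y) - (halfSize.y - radius)
        let outside = (max(qx, 0), max(qy, 0))
        let outsideLen = (outside.0 * outside.0 + outside.1 * outside.1).squareRoot()
        return outsideLen + min(max(qx, qy), 0) - radius
    }

    // Where should this glass pixel read from in the backing layer?
    // Coordinates are relative to the element's centre.
    func distortedSamplePoint(pixel: (x: Double, y: Double),
                              halfSize: (x: Double, y: Double),
                              params: LiquidGlassParams) -> (x: Double, y: Double) {
        let d = roundedRectDistance(point: pixel, halfSize: halfSize, radius: params.cornerRadius)
        let rim = max(0, 1 + d / params.rimWidth)           // 0 in the flat centre, 1 at the edge
        let strength = params.maxDisplacement * rim * rim   // ease in toward the rim
        let len = (pixel.x * pixel.x + pixel.y * pixel.y).squareRoot()
        guard len > 0 else { return pixel }
        // Pull the sample point toward the centre, like light bending through the curved rim.
        return (x: pixel.x - strength * pixel.x / len,
                y: pixel.y - strength * pixel.y / len)
    }

In the shipping version this would presumably be a Metal fragment function, with the backing layer sampled at the returned point plus a blur and some lighting on top, but the per-pixel arithmetic should be in this ballpark.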

“Supported” and “works well” ain’t the same. Do you remember how your iPhone 4 crawled when that effect was enabled?

Surely it's a performance nightmare, because whatever is behind the frosting has to be rendered in full. Without the blur, the compositor can see that the content is occluded and skip rendering it. Or does macOS not do that?