
Comment by doctorpangloss

8 months ago

I edited my comment.

> WebGPU enables compute shaders, and there are applications in anything that uses a GPU, from ML to physics to audio to … you name it.

I know.

If you have to go through a giant product like Unity, for example, to use WebGPU, because Apple will essentially have its own flavor of WebGPU just like it has its own flavor of everything, is it really cross-platform?

Does Apple support Vulkan? No. It was invented for middlewares!

Apple has a flag to toggle on WebGPU on iOS today. I know, dude. What does that really mean?

They have such a poor record of support for gamey things on Mobile Safari. No immersive WebXR, a long history of breaking WASM, a long history of poor WebGL 2 and texture compression support. Why is this going to be any different?

I’m still not sure what the point is. WebGPU is an API; is that what you mean by middleware? What’s the issue? Apple will do their own thing, and they might not allow WebGPU on Safari. What bearing does that have on what people using Linux, Windows, Firefox, and Chrome should do? And where exactly is the cross-platform claim you’re referring to?

  • > Apple will do their own thing, and they might not allow WebGPU on Safari.

    Safari has WebGPU support today, albeit behind a feature flag until it's fully baked. https://imgur.com/a/b3spVWd

    Not sure if this is good, but animometer shows an Avg Frame time of ~25.5 ms on a Mac Studio M1 Max with Safari 18.2 (20620.1.16.11.6). https://webgpu.github.io/webgpu-samples/sample/animometer/

    • The demo is doing a setBindGroup call per triangle, so the result isn't exactly surprising: this is a well-known bottleneck (Chrome's implementation is better optimized, but even there setBindGroup is a surprisingly slow call). But since both implementations run on top of Metal, there's no reason why Safari couldn't get at least to the same performance as Chrome.
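      One standard way around that bottleneck is dynamic uniform buffer offsets: create a single bind group over one large uniform buffer and pass a per-draw offset, instead of one setBindGroup per triangle. A minimal sketch; the render-pass part is illustrative only, and `sharedBindGroup`, `pass`, and `numObjects` are assumed names:

      ```javascript
      // WebGPU requires dynamic uniform offsets to be aligned to
      // minUniformBufferOffsetAlignment (256 bytes by default).
      const UNIFORM_ALIGN = 256;

      // Round a per-object uniform block size up to that alignment.
      function alignedStride(byteSize, align = UNIFORM_ALIGN) {
        return Math.ceil(byteSize / align) * align;
      }

      // Illustrative per-draw loop inside a render pass (not run here):
      // const stride = alignedStride(64); // e.g. one mat4 per object
      // for (let i = 0; i < numObjects; i++) {
      //   pass.setBindGroup(0, sharedBindGroup, [i * stride]);
      //   pass.draw(3);
      // }

      console.log(alignedStride(64));  // 256
      console.log(alignedStride(300)); // 512
      ```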

  • The issue is, it's likely that a company with $2 BILLION spent on product development and a very deep relationship with Apple, like Unity, will have success using WebGPU the way it is intended, and nobody else will. So then, in conclusion, WebGPU is designed for Unity, not you and me. Unity is designed for you and me. Are you getting it?

    • > The issue is, it's likely that a company with $2 BILLION spent on product development and a very deep relationship with Apple, like Unity, will have success using WebGPU the way it is intended, and nobody else will.

      Not really. Bevy https://bevyengine.org uses WebGPU exclusively, and unfortunately we have little funding - definitely not $2 billion. A lot of the stuff proposed in the article (bindless, 64-bit atomics, etc.) is stuff we (and others) proposed :)

      If anything, WebGPU the spec could really use _more_ funding and developer time from experienced graphics developers.


    • It seems like you’ve jumped to and are stuck on a conclusion that isn’t really supported, somewhat ignoring people from multiple companies in this thread who are actively using WebGPU, and it’s not clear what you want to have happen or why. Do you want WebGPU development to stop? Do you want Apple to support it? What outcome are you advocating for?

      Unity spends the vast majority of its money on other things, and Unity isn’t the only company that will make use of WebGPU. Saying nobody will have success with it is like saying nobody will succeed at using CUDA. We’re just talking about compute shaders. What is making you think they’re too hard to use without Apple’s help?
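      For a sense of scale, the compute shaders in question are small and vendor-neutral. Below is a minimal WGSL doubling kernel (real WGSL syntax, held in a string), plus a CPU mirror showing what one dispatch over the buffer computes; nothing in it is engine-specific:

      ```javascript
      // Minimal WGSL compute kernel: double every element of a
      // storage buffer.
      const shaderSource = /* wgsl */ `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;

      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id: vec3u) {
        if (id.x < arrayLength(&data)) {
          data[id.x] = data[id.x] * 2.0;
        }
      }`;

      // CPU mirror of the result of one dispatch over `data`:
      function doubleAll(data) {
        return data.map(x => x * 2);
      }

      console.log(doubleAll([1, 2, 3])); // [ 2, 4, 6 ]
      ```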

    • You haven't substantiated why nobody else could make use of WebGPU. Are Google the only ones who can understand Beacons because they make $300B/year? GPU programming is hard, but it doesn't take billions to figure out.

Apple submitted Metal as a web spec, the working group turned this into WebGPU, and Apple got everything they asked for to avoid Apple going rogue again. The fear that Apple, of all companies, is going to drop WebGPU support is really not based in reality.

> because Apple will essentially have its own flavor of WebGPU

Apple's WebGPU implementation in Safari is entirely spec compliant, and this time they've actually been faster than Firefox.

  • I wish Apple made a standalone Webgpu.framework, spun off from WebKit, so that apps could link to it directly instead of having to link to Dawn/wgpu.

    • Sounds like an interesting idea at first, until you consider that Apple would probably create a Swift/ObjC API around it instead of the standard webgpu.h C API. At that point you might as well use Metal, which is actually a bit less awkward than the WebGPU API in some areas.
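      For illustration, the appeal of the webgpu.h C ABI is that essentially any language can bind to plain C. The sketch below uses simplified stand-in types that only mirror the shape of real webgpu.h declarations (WGPUBufferDescriptor, wgpuDeviceCreateBuffer); it is not the actual header, and the entry point is hypothetical:

      ```c
      #include <stdint.h>
      #include <stdio.h>

      /* Stand-in mirroring the shape of WGPUBufferDescriptor
         (NOT the real webgpu.h declaration). */
      typedef struct {
          const char *label;
          uint64_t    size;   /* bytes */
          uint32_t    usage;  /* bitmask, e.g. storage | copy_dst */
      } BufferDescriptor;

      /* Hypothetical C entry point; Dawn/wgpu export the real
         equivalent (wgpuDeviceCreateBuffer) and back it with
         Metal/Vulkan/D3D12. */
      static uint64_t createBuffer(const BufferDescriptor *desc) {
          printf("creating '%s' (%llu bytes)\n", desc->label,
                 (unsigned long long)desc->size);
          return 1; /* opaque handle */
      }

      int main(void) {
          BufferDescriptor d = { "particles", 1024u, 0x84u };
          return createBuffer(&d) == 1 ? 0 : 1;
      }
      ```

      Because the surface is plain C, a Swift/ObjC wrapper could still be layered on top without losing FFI access from Rust, Zig, and friends.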
