Comment by dahart
8 months ago
Responding to your pre-edited comment.
> whom is WebGPU for? […] it’s a huge rigamarole.
Where is this comment coming from? WebGPU enables compute shaders, and there are applications in anything that uses a GPU, from ML to physics to audio to … you name it. What is making you think game engines would be the only users? I bet a lot of companies are looking forward to being able to use compute shaders in JS apps and web pages.
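For concreteness, here is a minimal sketch of what "compute shaders in JS" looks like with WebGPU: a shader that doubles every element of an array, dispatched from plain page JavaScript. Names like `doubleOnGpu` are illustrative, and this assumes a browser exposing `navigator.gpu`; it is a sketch of the standard WebGPU API shape, not production code.

```javascript
// Minimal WebGPU compute sketch: doubles each element of a Float32Array on the GPU.
// Illustrative only; requires a browser (or runtime) with navigator.gpu available.
const shaderCode = /* wgsl */ `
  @group(0) @binding(0) var<storage, read_write> data: array<f32>;
  @compute @workgroup_size(64)
  fn main(@builtin(global_invocation_id) id: vec3<u32>) {
    if (id.x < arrayLength(&data)) {
      data[id.x] = data[id.x] * 2.0;
    }
  }
`;

async function doubleOnGpu(input) {
  const adapter = await navigator.gpu.requestAdapter();
  const device = await adapter.requestDevice();

  // Storage buffer the shader reads and writes, seeded with the input data.
  const buffer = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
    mappedAtCreation: true,
  });
  new Float32Array(buffer.getMappedRange()).set(input);
  buffer.unmap();

  const pipeline = device.createComputePipeline({
    layout: "auto",
    compute: {
      module: device.createShaderModule({ code: shaderCode }),
      entryPoint: "main",
    },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer } }],
  });

  // Separate buffer for reading results back to the CPU.
  const readback = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ,
  });

  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64));
  pass.end();
  encoder.copyBufferToBuffer(buffer, 0, readback, 0, input.byteLength);
  device.queue.submit([encoder.finish()]);

  await readback.mapAsync(GPUMapMode.READ);
  return new Float32Array(readback.getMappedRange()).slice();
}
```

Nothing here needs Unity or any engine in between: it's ~50 lines of page JS, which is exactly why it's interesting to more than just game engines.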
> Godot has had zero development for WebGPU support.
Why would Godot be an indicator? I love Godot and their efforts, but it has less than 1% of game engine market share, and a much smaller, less well-funded team. Of course they're not on the bleeding edge. Unity is closer to 30% market share and is actively engaging with WebGPU, so it seems like you're downplaying and contradicting a strong indicator.
I edited my comment.
> WebGPU enables compute shaders, and there are applications in anything that uses a GPU, from ML to physics to audio to … you name it.
I know.
If you have to go through a giant product like Unity to use WebGPU, because Apple will essentially have its own flavor of WebGPU just as it has its own flavor of everything else, is it really cross-platform?
Does Apple support Vulkan? No. It was invented for middleware!
Apple has a flag to toggle on WebGPU on iOS today. I know, dude. What does that really mean?
They have such a poor record of support for gamey things on Mobile Safari. No immersive WebXR, a long history of breaking WASM, a long history of poor WebGL 2 and texture compression support. Why is this going to be any different?
I'm still not sure what the point is. WebGPU is an API; is that what you mean by middleware? What's the issue? Apple will do their own thing, and they might not allow WebGPU on Safari. What bearing does that have on what people using Linux, Windows, Firefox, and Chrome should do? And where exactly is this cross-platform claim you're referring to?
> Apple will do their own thing, and they might not allow WebGPU on Safari.
Safari has WebGPU support today, albeit behind a feature flag until it's fully baked. https://imgur.com/a/b3spVWd
Not sure if this is good, but Animometer shows an average frame time of ~25.5 ms on a Mac Studio M1 Max with Safari 18.2 (20620.1.16.11.6). https://webgpu.github.io/webgpu-samples/sample/animometer/
The issue is that a company with $2 billion spent on product development and a very deep relationship with Apple, like Unity, will likely succeed in using WebGPU the way it is intended, and nobody else will. So, in conclusion, WebGPU is designed for Unity, not for you and me. Unity is designed for you and me. Are you getting it?
Apple submitted Metal as a web spec, the working group turned that into WebGPU, and Apple got everything it asked for to avoid Apple going rogue again. The fear that Apple, of all companies, is going to drop WebGPU support is really not based in reality.
> because Apple will essentially have its own flavor of WebGPU
Apple's WebGPU implementation in Safari is entirely spec compliant, and this time they've actually been faster than Firefox.
I wish Apple made a standalone WebGPU framework, spun off from WebKit, so that apps could link to it directly instead of having to link to Dawn or wgpu.