Comment by doctorpangloss

8 months ago

Whom is WebGPU for, besides Unity?

Responding to your pre-edited comment.

> whom is WebGPU for? […] it’s a huge rigamarole.

Where is this comment coming from? WebGPU enables compute shaders, and there are applications in anything that uses a GPU, from ML to physics to audio to … you name it. What is making you think game engines would be the only users? I bet a lot of companies are looking forward to being able to use compute shaders in JS apps and web pages.
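
To make that concrete, here is a minimal sketch of a compute dispatch through the browser's WebGPU API in TypeScript. It assumes a browser that exposes navigator.gpu; the doubling shader is purely illustrative, and reading results back (which needs a MAP_READ staging buffer) is omitted.

```ts
// Minimal WebGPU compute dispatch (run in an ES module / async context).
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) throw new Error("WebGPU not supported in this browser");
const device = await adapter.requestDevice();

// Illustrative WGSL shader: double every element of a storage buffer in place.
const shaderModule = device.createShaderModule({
  code: `
    @group(0) @binding(0) var<storage, read_write> data: array<f32>;
    @compute @workgroup_size(64)
    fn main(@builtin(global_invocation_id) id: vec3<u32>) {
      if (id.x < arrayLength(&data)) {
        data[id.x] = data[id.x] * 2.0;
      }
    }`,
});

// Upload some input data.
const input = new Float32Array([1, 2, 3, 4]);
const buffer = device.createBuffer({
  size: input.byteLength,
  usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
  mappedAtCreation: true,
});
new Float32Array(buffer.getMappedRange()).set(input);
buffer.unmap();

const pipeline = device.createComputePipeline({
  layout: "auto",
  compute: { module: shaderModule, entryPoint: "main" },
});
const bindGroup = device.createBindGroup({
  layout: pipeline.getBindGroupLayout(0),
  entries: [{ binding: 0, resource: { buffer } }],
});

// Encode and submit one dispatch covering the whole array.
const encoder = device.createCommandEncoder();
const pass = encoder.beginComputePass();
pass.setPipeline(pipeline);
pass.setBindGroup(0, bindGroup);
pass.dispatchWorkgroups(Math.ceil(input.length / 64));
pass.end();
device.queue.submit([encoder.finish()]);
// Reading the result back would require a MAP_READ staging buffer (omitted).
```

Nothing here is engine-specific; it runs in plain JS/TS on a page, which is exactly what general-purpose GPU work in the browser needed.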

> Godot has had zero development for WebGPU support.

Why would Godot be an indicator? I love Godot and their efforts, but it's less than 1% of game engine market share, built by a much smaller, less well-funded team. Of course they're not on the bleeding edge. Unity is closer to 30% market share and is actively engaging with WebGPU, so it seems like you're downplaying and contradicting a strong indicator.

  • I edited my comment.

    > WebGPU enables compute shaders, and there are applications in anything that uses a GPU, from ML to physics to audio to … you name it.

    I know.

    If you have to go through a giant product like Unity, for example, to use WebGPU, because Apple will essentially have its own flavor of WebGPU just like it has its own flavor of everything, is it really cross-platform?

    Does Apple support Vulkan? No. WebGPU was invented for middleware!

    Apple has a flag to toggle on WebGPU on iOS today. I know, dude. What does that really mean?

    They have such a poor record of support for gamey things on Mobile Safari: no immersive WebXR, a long history of breaking WASM, and a long history of poor WebGL 2 and texture compression support. Why is this going to be any different?

    • I’m still not sure what the point is. WebGPU is an API; is that what you mean by middleware? What’s the issue? Apple will do their own thing, and they might not allow WebGPU on Safari. What bearing does that have on what people using Linux, Windows, Firefox, and Chrome should do? And where exactly is this cross-platform claim you’re referring to?

    • Apple submitted Metal as a web spec, the working group turned this into WebGPU, and Apple got everything they asked for to avoid Apple going rogue again. The fear that Apple, of all companies, is going to drop WebGPU support is really not based in reality.

    • > because Apple will essentially have its own flavor of WebGPU

      Apple's WebGPU implementation in Safari is entirely spec-compliant, and this time they've actually been faster than Firefox.

The number 1 use of WebGL is Google Maps, by several orders of magnitude over any other use. At some point they'll likely switch to WebGPU, making Google Maps the number 1 use of WebGPU. Google went over what this enables when they shipped it: lots of features, including the ability to highlight relevant roads that change depending on what you searched for.

https://www.youtube.com/watch?v=HrLyZ24UcRE

Apple Maps and others also use it.

Web video editor here (https://chillin.online); we are eagerly looking forward to the WebGPU API maturing and being extended to all major browsers, enabling faster rendering, bringing more effects, and facilitating the rendering and editing of 3D assets.

Devs in the future? There was a long gap between when WebGL2 was released and when it finally worked "everywhere" too.

I'm new to GPU programming, and WebGPU and Rust's wgpu seem pretty nice to me (except the bindings!). The API is high-level enough that it's easy to learn. It hasn't grown deprecated parts or vendor-specific extensions yet.
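
As an illustration of how little ceremony the API involves, here is a minimal sketch of device setup and a single clear pass via the browser API (the canvas id "view" is hypothetical, and the @webgpu/types definitions are assumed for the TypeScript casts; wgpu's Rust API follows the same shape):

```ts
// Minimal WebGPU setup: acquire a device and clear a canvas to dark blue.
const canvas = document.getElementById("view") as HTMLCanvasElement;
const context = canvas.getContext("webgpu") as GPUCanvasContext;

const adapter = await navigator.gpu.requestAdapter();
if (!adapter) throw new Error("WebGPU not available");
const device = await adapter.requestDevice();
context.configure({ device, format: navigator.gpu.getPreferredCanvasFormat() });

// Record and submit one render pass that only clears the canvas.
const encoder = device.createCommandEncoder();
const pass = encoder.beginRenderPass({
  colorAttachments: [{
    view: context.getCurrentTexture().createView(),
    clearValue: { r: 0.0, g: 0.0, b: 0.2, a: 1.0 },
    loadOp: "clear",
    storeOp: "store",
  }],
});
pass.end();
device.queue.submit([encoder.finish()]);
```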

Our browser based game engine Construct (https://www.construct.net) supports rendering with both WebGL and WebGPU.