Shipping WebGPU on Windows in Firefox 141

8 days ago (mozillagfx.wordpress.com)

As I see it, the current state of graphics APIs is worse now than in the OpenGL era. Despite their promises, none of the modern APIs are easier to use, truly portable, or cross-platform. Having to reinvent OpenGL by writing custom wrappers around Vulkan, Metal, DirectX 12, etc. is as much of a time waster as dropping strings and going back to raw char arrays in the name of performance would be in any modern language.

  • What promises were made, by whom? Graphics APIs have never been about ease of use as a first order goal. They've been about getting code and data into GPUs as fast as reasonably possible. DevEx will always play second fiddle to that.

    I think WebGPU is a decent wrapper for exposing compute and render in the browser. Not perfect by any means - I've had a few paper cuts working with the API so far - but a lot more discoverable and intuitive than I ever found WebGL and OpenGL.
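
    For reference, the basic bring-up is only a handful of promise-based calls (a minimal TypeScript sketch; the WGSL body and names here are illustrative, not any particular project's code):

      const adapter = await navigator.gpu?.requestAdapter();
      if (!adapter) throw new Error("WebGPU not available");
      const device = await adapter.requestDevice();

      // A do-nothing compute shader, just to show the shape of the API.
      const module = device.createShaderModule({
        code: `
          @compute @workgroup_size(64)
          fn main(@builtin(global_invocation_id) id: vec3u) {}
        `,
      });
      const pipeline = device.createComputePipeline({
        layout: "auto",
        compute: { module, entryPoint: "main" },
      });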

    • > They've been about getting code and data into GPUs as fast as reasonably possible. DevEx will always play second fiddle to that.

      That's a tiny bit of revisionist history. Each new major D3D version (at least before D3D12) also fixed usability warts of the previous version, with D3D11 probably being the most convenient-to-use 3D API, while also giving excellent performance.

      Metal also definitely has a healthy balance between convenience and low overhead - and more recent Metal versions are an excellent example that a high-performance modern 3D API doesn't have to be hard to use, nor require thousands of lines of boilerplate to get a triangle on screen.

      OTOH, OpenGL had been on a steady downward usability trend since the end of the 1990s, and Vulkan unfortunately has continued it (but may steer in the right direction in the future):

      https://www.youtube.com/watch?v=NM-SzTHAKGo

      5 replies →

    • "What promises were made, by whom?"

      Technically true, but practically tone deaf.

      WebGPU is both years too late and just a bit early. Whereas WebGL was OpenGL circa 2005, WebGPU is native graphics circa 2015. It shouldn't need to be said that the bleeding-edge new standard for web graphics shouldn't be both 10 years out of date and awful.

      Vendors are finally starting to deprecate the old binding model as the byzantine machinery that it is. Bindless resources are an absolute necessity for the modern style of rendering with Nanite and ray tracing.

      Rust's WGPU on native supports some of this, but WebGPU itself doesn't.

      It's only intuitive if you don't realize just how huge the gap is between dispatching a vertex shader to render some triangles and actually producing a lit, shaded, occluded image with PBR, indirect lighting, antialiasing, and postfx. Would you like to render high-quality lines or points? Sorry, making that simple hasn't been a priority. Better go study up on SDFs and Béziers.

      Which, tbh, is the impression I get from WebGPU efforts. Everyone forgets the drivers have been playing pretend for decades, and very few have actually done the homework. Of those that have, most are too enamored with being a l33t gfx coder to realize how terrible the dev exp is.

      13 replies →

  • I don't see the problem. There have been lower-level APIs in the graphics stack for a long time (e.g. Mesa's Gallium); only now they are standardised and people are actually choosing to use them. It's not like higher-level APIs don't exist anymore: OpenGL is still supported on reasonable platforms, and WebGPU has been usable from native code for some time.

    As for true portability of those low-level APIs, you've basically got Apple to blame (and game console manufacturers, but I don't think anyone expected them to cooperate).

    • > you've basically got Apple to blame

      Yeah, that's the thing that really irks me. WebGPU could have been just a light wrapper over Vulkan, like WebGL is (or was, it's complicated now) for OpenGL. But Apple has been on a dumb war with Khronos for the last decade, which has made everything more difficult.

      So now we have n+1 low level standards for GPU programming not because we needed them, but because 1 major player is obstinate.

      5 replies →

    • Your last paragraph is fairly revisionist to me.

      How is Apple solely to blame when there are multiple parties involved? They went to Khronos to turn AMD's Mantle into a true unified next-gen API. Khronos and NVIDIA shot them down in order to further AZDO OpenGL. So Metal came to be, then DX12 followed, and then Vulkan, once Khronos realized it had to move that way.

      But even if you exclude Metal, what about Microsoft and D3D? Also similarly non-portable. Yet it’s the primary API in use for non-console graphics. You rarely see people complaining about the portability of DX for some reason…

      And then in an extremely distant last place is Vulkan. Very few graphics apps actually use Vulkan directly.

      Have you actually tried writing code against any of these graphics APIs?

      10 replies →

  • I think what we learned from the OpenGL era is that it's actually not very relevant whether all platforms use the same high-level API to talk to the GPU hardware. What matters is that the platform's chosen API offers good control of the hardware it runs on.

    You say this requires reinvention, but really the end work is "translate OpenGL to something the hardware can actually understand" in both scenarios. The difference in the OpenGL era is that you did not have the option of avoiding the wrapper, not that no wrapper existed. Targeting the best of each possible hardware type individually, without baking in assumptions about the hardware, has proven impractical. But that only matters if you're building an "easy translation layer" rather than using one, or if you're trying to target specific types of hardware very directly (in which case you don't want something super generic or simple; you want something that exposes the hardware as directly as is reasonable for that hardware type).

    • Writing engines predates OpenGL; it has always been more of a FOSS-culture thing to assume OpenGL exists everywhere, due to its UNIX origins.

      1 reply →

  • OpenGL became a mess of an API after 2.0, and WebGPU is actually a fairly easy-to-use wrapper around Vk, D3D12, and Metal - definitely better designed than what OpenGL has become.

    Apart from that, D3D11 and Metal v1 are probably the sweet spot between ease of use and performance (especially D3D11's performance is hard to beat, even in Vulkan and D3D12).

    • D3D11 is so nice to use that I feel like the ideal workflow for ground-up graphics applications is to just write everything in D3D11 and then let middleware layers on Linux (Proton) or Mac (Game Porting Toolkit) handle the translation. DirectX also has a whole suite of first-party software (DirectXMath, DirectXTK) which makes a lot of common workflows much simpler.

      If only the Windows team could pull out of its tailspin, because almost everything else MS produces on the Windows side gets worse and worse every year.

  • OpenGL is still such a powerful technology. I use it all the time because Vulkan is just so much more difficult to use. It's a pity; so much good software isn't being built because OGL is more or less a dead man walking.

  • I agree. I'll keep using OpenGL with CUDA interop until something better shows up. Vulkan isn't it. I tried Vulkan to get away from OpenGL, but ended up with CUDA instead since it's so much nicer to work with. Vulkan has way too much overengineered complexity with zero benefit.

  • Not really. For example, in the OpenGL era there was this urban myth that game consoles used OpenGL; that was never really the case.

    Nintendo, after graduating to devkits where C and C++ could be used (like the N64's), had OpenGL-inspired APIs, which isn't really the same, although there was some GLSL-like shader support.

    They only started supporting Khronos APIs with the Switch, and even then, if you want the full power of the Switch, NVN is the way to go.

    PlayStation always had proprietary APIs; there was a small stint with OpenGL ES 1.0 + Cg, which had very little to no uptake among developers, and it was dropped from the devkits.

    Sega only had proprietary APIs, and there was a small collaboration with Microsoft for DirectX, which only a few studios took advantage of.

    Xbox, naturally, has always been about DirectX.

    Go watch the GDC Vault programming track and count how many developers you find complaining about writing middleware for their game engines (if any at all), versus how many talks there are about taking advantage of every little low-level detail of the hardware architecture.

    • Early console APIs were more similar to Direct3D 1, with very rudimentary immediate-mode commands. Modern console APIs still have a less stateful, easy API layer, like D3D10/11, but also expose more low-level functionality.

      OpenGL didn't match the hardware well except on SGI hardware or carryover descendants like 3dfx.

  • IMO things have never been better.

    Vulkan works approximately everywhere (except Apple, but that's entirely self-inflicted, and there's a compatibility layer, so it's NotMyProblem). OpenGL is more portable than ever thanks to software implementations that yield far more consistent behavior between platforms than was available historically. WebGPU is actually fairly nice to work with, has a well-maintained native implementation for two major systems languages, and both of those implementations have (AFAIK) fully functional WASM support. If it happens to gain a native Mesa implementation once everything stabilizes, that will merely be icing on the cake. OpenCL has multiple competing implementations, including PoCL, which is an adapter providing decently broad support on top of other backends.

    And if you don't want to fiddle with native APIs (which, no offense intended, you very clearly sound like you don't), there are quite a few choices available to abstract all the low-level details away: cross-platform, cross-API middleware that is FOSS and actively maintained.

  • Yeah, it’s kind of insane how things have gotten.

    There are no adults, no leaders with an eye on things leading us away from further mistakes, and we keep going deeper.

    • The point of the graphics APIs is to be as close to the metal as possible. It’s a balancing act between portability and hardware design/performance. I really don’t think making something universal is as trivial as non-graphics engineers make it out to be.

      But even when such a thing existed in the form of OpenGL, or now WebGPU, people complained about the performance overhead. So you end up back here.

      5 replies →

    • What would a leader do? Nvidia wants to sell hardware, Nintendo wants to sell games, and Microsoft wants to either buy Linux or crush it. Nobody has a stake in things actually working.

  • Modern graphics APIs are the graphical equivalent to assembly language - you are not supposed to use them directly, but through another higher level layer, like a programming language or a graphics engine.

    They are a specialized API intended for tool writers.

I'm still hoping that WebGPU somehow takes off for non-web use so that we have an easy-to-use cross-platform API with an official spec (a replacement for OpenGL). However, outside of the Rust world there doesn't seem to be much interest in using WebGPU for native code; I don't know of any big projects using Dawn, for example. Part of the reason seems to be that WebGPU came a bit too late, and everyone was already using custom-built abstractions over DX, Vulkan, and Metal.

  • It won't. It's barely simpler but lacks a lot of functionality. Some things that became optional in Vulkan (render passes) are still mandatory in WebGPU, and bind groups are static and thus cumbersome. It also adds additional limitations and cruft; for example, you can't easily transfer from host to a buffer subregion and need staging buffers (see the sketch below).

    I'll use it for web since there is no alternative, but for desktop I'll stick with an OpenGL+CUDA interop framework until a sane, modern graphics API shows up - i.e., a graphics API that gets rid of render passes, static pipelines, mandatory explicit syncing, bindings and descriptor sets (simply use buffers and pointers), and all the other nonsense.

    If allocating and populating a buffer takes more effort than a simple cuMemAlloc and cuMemcpy, and calling a shader with arguments takes more than simply passing the shader pointers to the data, then I'm out.
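
    To make the buffer complaint concrete, here's roughly what the staging-buffer dance looks like in WebGPU (a TypeScript sketch; `device`, `data`, `dstBuffer`, and `dstOffset` are assumed to exist, and `queue.writeBuffer()` can hide some of this for simple cases):

      // Stage the bytes in a mappable buffer, then copy into a subregion
      // of the destination. Sketch only; sizes and offsets must be
      // multiples of 4, and error handling is omitted.
      const staging = device.createBuffer({
        size: data.byteLength,
        usage: GPUBufferUsage.MAP_WRITE | GPUBufferUsage.COPY_SRC,
        mappedAtCreation: true,
      });
      new Uint8Array(staging.getMappedRange()).set(data);
      staging.unmap();

      const encoder = device.createCommandEncoder();
      encoder.copyBufferToBuffer(staging, 0, dstBuffer, dstOffset, data.byteLength);
      device.queue.submit([encoder.finish()]);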

    • ...that's assuming that the WebGPU API is set in stone, which hopefully it isn't.

      They'd do well to follow the D3D model (major breaking versions, while guaranteeing backward compatibility for older versions) - e.g. WebGPU2, WebGPU3, WebGPU4 each being a mostly new API without having to compromise for backward compatibility.

      3 replies →

  • Another reason may be that WebGPU doesn't allow for as much optimization and control as Vulkan, and the performance isn't as good. WebGPU also doesn't have all the extensions that Vulkan has.

  • Some part of it is probably also the atrocious naming. I don't do anything with the web, only native coding, so whenever I heard something about WebGPU somewhere I just ignored it, for literally years, because I assumed it was some new web tech and thus not relevant to me at all.

Very happy to see this, as it means that our gpu-allocator [0] crate (used currently by wgpu's DX12 backend, but capable of supporting Vulkan & Metal as well) will see a significantly wider audience than what we've been using it for so far (which is shipping our GPU benchmark suite, Evolve [1]).

[0]: https://github.com/Traverse-Research/gpu-allocator/

[1]: https://www.evolvebenchmark.com/

This is very exciting, congrats to the Firefox team!

My company is working to bring Unreal to the browser, and we've built out a custom WebGPU RHI for Unreal Engine 5.

Here are demos of the tech in action, for anyone interested:

(Will only work on Chromium-based browsers on desktop, and on some Android phones)

Cropout: https://play-dev.simplystream.com/?token=aa91857c-ab14-4c24-...

Car configurator: https://garage.cjponyparts.com/

  • > (Will only work on Chromium-based browsers on desktop, and on some Android phones)

    This post is about WebGPU in Firefox. Do you plan to test and/or release a Firefox-compatible version?

    • Is it possible that this is just because Firefox hadn't shipped yet, and the demo is already spec-compatible, so it merely requires Firefox to match the spec?

  • On Firefox 142 (nightly):

    Cropout: After being stuck at 0% for a long while and 1200 network requests, it loads to a menu with a black background and will start a game but only UI elements show up. Seems to have a lot of errors parsing shaders, as well as a few other miscellaneous errors.

    Car configurator: Several errors while at 0% (never loads), the first among them being `[223402304]: MessageBox type 0 Caption Message Text Game files required to initialize the global shader and cooked content are most likely missing. Refer to Engine log for details.`

    I would concur with others that you should at least test this in Firefox before advertising it here.

  • In Google Chrome for macOS: 0%, and not moving, on the first link; stops at 98% (sometimes 97%) on the second one. Same with Safari.

    • IMO this is why Unreal (and Unity) for web is just not a good fit. Most games made in those engines use hundreds of megs of assets. You download the 100-500 MB file to your hard drive, then run the game. That's not the web.

      To be good on the web requires designing your game to start immediately with a minimal amount downloaded. Maybe stream some stuff in the background, but be playable immediately. AFAICT neither Unreal nor Unity does that by default. You can maybe coerce them into it, but most devs don't. As such, they get these bad experiences when they try to put their creations on the web.

    • Sorry to hear that! I will say that usually if you wait long enough, it will eventually load. Try popping open your dev console sidebar, you should see assets downloading over the network.

      If it does crash, you'll be able to see why. I'd be interested in seeing any bug reports if you do find some; we're always squashing bugs over here!

  • I keep seeing these posts and they never work on hardware that I actually own.

    Are we supposed to try them out on the same kind of high-end gamer desktop setup that the native version requires?

I hadn't realized WebGPU was already available on macOS in the Firefox Nightlies!

I just installed the Mac nightly from https://www.mozilla.org/en-US/firefox/channel/desktop/ and now this demo works: https://huggingface.co/spaces/reach-vb/github-issue-generato...

It runs the SmolLM2 model compiled to WebAssembly for structured data extraction. I previously thought that demo only worked in Chrome.

(If I try it in regular Firefox for Mac I get "Error: WebGPU is not supported in your current environment, but it is necessary to run the WebLLM engine.")
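
The check behind that error message is presumably something like this (a minimal sketch, not WebLLM's actual code):

  // Feature-detect WebGPU before trying to load a model.
  if (!navigator.gpu) {
    throw new Error("WebGPU is not supported in your current environment");
  }
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) throw new Error("No suitable GPU adapter found");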

  • Member of Firefox's WebGPU team here. This is expected. Stable support for macOS is something we hope to ship soon! From the post:

    > Although Firefox 141 enables WebGPU only on Windows, we plan to ship WebGPU on Mac and Linux in the coming months, and finally on Android.

> we plan to ship WebGPU on Mac and Linux in the coming months, and finally on Android

Sounds good. I'm not really thrilled about it as of now. Whatever the reason, it hasn't been supported on Linux by any browser as of yet. My guess is it's too hard to expose without creating terrible attack surfaces.

This seems to support my view that web standards are too overgrown for how users actually use the web. It's obviously too late to do anything about it now, but all the issues of monoculture and funding we worry about today stem from the complexity of making a web browser, due to decisions tracing all the way back to the days of Netscape.

  • Depends on which Linux: it is supported on Android/Linux, WebOS/Linux, and ChromeOS/Linux.

    However, it kind of proves the point about how relevant browser vendors consider GNU/Linux for these kinds of workloads.

Thanks, looking forward to the Linux implementation as well. Are there any WebGPU demos worth trying when this is released?

Finally! Kudos to everyone involved in this.

I was feeling a bit dirty playing around with WebGPU with only Chrome in the game thus far; even Safari enabled its preview quite recently.

Seems like Firefox will ship WebGPU on Linux before Chrome does, then.

  • Chrome already does, just not on GNU/Linux.

    It is available on Android/Linux, WebOS/Linux and ChromeOS/Linux.

    Which tells you where they see the ..../Linux value for WebGPU.

  • Which is a bit weird, honestly, since Dawn (Google’s WebGPU implementation) works pretty well on Linux.

I have been using wgpu for my main projects for nearly two years now. Let's hope this rollout means more maintainers, so issues I opened 18 months ago bug more people and eventually get resolved. Never touched Rust myself, but maybe I'll find the motivation and time to do it myself.

As I also depend on the wgpu-native bindings, updates are slow to reach me. For example, we just got to v25 last week, and v26 dropped a couple of days prior to that.

What are the use cases for this? Are we sure sites are not just going to use it to mine bitcoins using their users' hardware?

  • - Streaming point cloud data sets over web browsers (used by many surveying and construction companies, as well as geospatial government agencies).

    - Visualizing other scan data, such as Gaussian splat data sets or triangle meshes from photogrammetry.

    - Things like Google Earth, Cesium, or other 3D globe viewers.

    It's a pretty big thing in geospatial sciences and industry.

  • Same use cases that native apps have for using a GPU except in a browser?

    > Are we sure sites are not just going to use it to mine bitcoins using their users' hardware?

    Some almost certainly will, but like all similar issues, the game of cat and mouse will continue.

  • It'll open the door for more ambitious webgames and web apps that use the GPU.

      I keep waiting to see ambitious web games that could match the experience of Infinity Blade from 2010, used to demo iOS's new OpenGL ES 3.0 capabilities, the foundation of WebGL 2.0.

      https://en.wikipedia.org/wiki/Infinity_Blade

      Game demo, https://www.youtube.com/watch?v=_w2CXudqc6c

      The only thing I like about Web 3D APIs is that, outside middleware engines, they are the only mainstream 3D APIs designed with managed languages in mind, instead of as after-the-fact bindings.

      Still waiting for something like RenderDoc in the respective browser developer tools; we never got anything better than SpectorJS.

      It isn't even printf debugging, rather pixel-colour debugging.

      2 replies →

  • Probably the primary use will be fingerprinting, if WebGPU provides the GPU name and driver/library info like WebGL does.
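
    For what it's worth, the spec does define a GPUAdapterInfo object, and how much a browser fills in (and whether it buckets the values) is up to the implementation. A sketch, assuming the newer `adapter.info` attribute (some older implementations used `requestAdapterInfo()` instead):

      const adapter = await navigator.gpu.requestAdapter();
      // GPUAdapterInfo fields; browsers may return coarse, bucketed values.
      console.log(adapter?.info.vendor, adapter?.info.architecture);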

  • You can already do that in WebGL, WASM, or JavaScript; thankfully, all those technologies are easily ad-blockable.

Pretty sure Apple is going to release WebGPU support in Safari in macOS 26 Tahoe as well. There was a WebGPU video in one of the WWDC sessions.

Are there enough devs working on uses for this? I was hoping for a resurgence like Flash devs 20 years ago.

  • There's no way you'll see anything like that. Flash was dead simple; a 12-year-old could throw a simple game together and upload it. WebGPU will require a skilled graphics programmer just to write (or, more likely, cross-compile) these weird shaders.

    And the SWF format had insane compatibility, literally unmatched by any other technology imo; we didn't even think about OSes, it really was "write once, run anywhere" (pre-smartphone, ofc). On the web, even basic CSS doesn't work the same from OS to OS, and WebGL apps still crash randomly on 10% of devices. It'll probably be 5 years before WebGPU is even remotely stable.

    Not even to mention the fully integrated editor environment.

    Or I guess maybe you're saying someone should build something like Flash targeting WebGPU? Probably the closest there is to that right now is Figma? But it feels weak too imo, and was already possible with WebGL. Maybe Unreal Engine is the bet.

  • If you follow things like three.js you'll be painfully aware that, in truth, there doesn't seem to be much use for this at all. "3D on the web" is something that sounds fun until it's possible, at which point it becomes meh. The exception proving the rule would be that Marble Madness promo game (https://www.luduxia.com/), and what I learned is that the moment it all works, people just assume it was nothing and move on.

    • > You could absolutely have done web Minecraft years ago, and it's very revealing such a thing is not wildly popular.

      Minecraft started as a Java applet in the browser; that's part of the reason it was able to gain such a rapid following.

    • There are so many blockers, versus old style Flash games.

      Driver and OS blacklisting means that game developers aren't aware of the user experience, nor can they control it, as they can in native games or in server-side rendering with streaming.

      No proper debugging tools other than printf/pixel debugging.

      The number of loading screens that would be needed, given the memory constraints of browser sessions.

      This alone means there is hardly much ROI for 3D web games, and most uses end up being in e-commerce or Google Maps-style applications.

Very cool! Now, let's see how long it will take for G-products to actually use it instead of complaining "browser not supported for this feature, use Chrome".

  • Which Google products use it?

    • The one I can think of is Google Meet, where some GPU thing is used to add background effects such as blur. However, I'm not sure this actually uses WebGPU; it used to work on Firefox until Google added a browser check, and AFAIK, if you could fool Meet into thinking Firefox was Chrome, it would still work.

      This might still be a semi-legitimate thing, i.e. maybe they kept around a WebGL implementation for a while as a fallback but moved the main implementation to WebGPU and don't want to maintain the fallback. It certainly fits well into their strategy of making sure that the web really only works properly with Chrome.

      6 replies →

    • I don't know :) I was referring basically to the Meet situation back in the day with FF, where the feature was there but Meet complained that the browser was not capable.

      2 replies →