> The current roadmap lists Shadow DOM and CSS Grid as priorities
I've been working on the CSS Grid support. I'm about to land "named grid lines and areas" support, which should make a bunch more websites lay out correctly (the feature is sketched below).
I'm biased because it's my project, but IMO the approach Servo is using for CSS Grid is pretty cool in that the actual implementation lives in an external library (Taffy [0]) that can be used standalone and is widely used across the Rust UI ecosystem, including in the Blitz [1] web engine (which also uses Taffy for Flexbox and Block layout), the Zed [2] text editor, and the Bevy [3] game engine.
I'm hopeful that this approach of breaking down a web engine into independently usable modules with public APIs (which builds upon Servo's earlier work on modular libraries such as Stylo and html5ever) will make it easier for people to get involved in web engine development (as they can understand each piece in isolation), and make it easier for people to create new web engines in future (as they won't have to start completely from scratch).
[0]: https://github.com/DioxusLabs/taffy [1]: https://github.com/DioxusLabs/blitz [2]: https://zed.dev [3]: https://bevy.org/
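For anyone who hasn't used the feature, here's a minimal sketch of what "named grid lines and areas" covers in CSS (the selectors and names are my own illustration, not anything from Servo or Taffy):

    .page {
      display: grid;
      /* Named areas: each quoted string is a row, each word names a cell */
      grid-template-areas:
        "header header"
        "nav    main";
      /* Named lines: the bracketed names label the column boundaries */
      grid-template-columns: [nav-start] 12rem [nav-end main-start] 1fr [main-end];
      grid-template-rows: auto 1fr;
    }
    .page > header { grid-area: header; }
    .page > nav    { grid-area: nav; }
    /* Items can also be placed against the named lines directly */
    .page > main   { grid-column: main-start / main-end; grid-row: 2; }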
> (Taffy [0]) that can be used standalone and is widely used across the Rust UI ecosystem, including in the Blitz [1] web engine (which also uses Taffy for Flexbox and Block layout)
This is the first time I've heard of Blitz. It looks equally interesting and ambitious. It is probably the real undercover web engine. Servo was widely known around the time Rust debuted.
> This is the first time I've heard of Blitz. It looks equally interesting and ambitious. It is probably the real undercover web engine
It's certainly a newer and lesser-known engine. It's mostly been me working on it for the past year or so (with a couple of other occasional contributors). But I do have funding to work on it full time through DioxusLabs (who are building Dioxus Native - a Flutter / React Native competitor on top of it) and NLnet (who are a non-profit interested in the alternative web browser use case).
We're trying to really push on the modular side of things to create a web engine that's flexible / hackable and can be moulded for a variety of use cases.
We'd love more contributors, so if anyone is interested in getting involved then drop by our GitHub (https://github.com/DioxusLabs/blitz/) or Discord (https://discord.gg/AnNPqT95pu - #native channel)
Questions for the Rust UX experts:
Is Dioxus (or Leptos) much more performant than Tauri/Electron?
I want to (1) build blindingly fast, low-latency, super performant UX for users, which precludes Tauri/Electron (something I'm currently using and unhappy about), but I also want to (2) maintain developer velocity, (3) have access to nice UX primitives and widgets, and (4) have it look nice and modern.
Javascript/browser-oriented frameworks make requirements 2-4 easy, and they have the side benefit of also making hiring easy (not a requirement per se). But the results feel so bloated and anti-desktop/native: they gobble up RAM and render slowly, even when best practices are used. It's the very definition of a double-edged sword.
Are these four requirements simply impossible to satisfy together for native Rust UX toolkits right now?
Rust's egui looks amazing, but I doubt you'd be able to build a very complicated UX with it. Or if you could, it might take you half a year to deliver.
Iced also looks cool, but seems less featureful.
Are there any "non-browser" Rust UX toolkits that aren't dated GTK/KDE frameworks, but that can build graphically-oriented (not just text/button widget) programs?
If I were building a "slimmed down photoshop", are there any Rust GUI toolkits to reach for? Or if I were incorporating a Bevy or 3D render pane?
Weren't Rust and Servo co-developed?
Does having experience implementing a web browser engine feature change the way you write HTML or CSS in any way? Do you still google "css grid cheatsheet" three times a week like the rest of us?
> Does having experience implementing a web browser engine feature change the way you write HTML or CSS in any way?
I think I'm more conscious of what's performant in CSS. In particular, both Flexbox and CSS Grid like to remeasure things a lot by default, but this can be avoided with a couple of tricks:
- For Flexbox, always set `flex-basis: 0` and `min-width: 0`/`min-height: 0` if you can do so without affecting the layout. This allows the algorithm to skip measuring the "intrinsic" (content-based) size.
- For CSS Grid, the analogous trick is to use `minmax(0, 1fr)` rather than just `1fr` (both tricks are sketched after this list).
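A minimal sketch of both tricks as stylesheet rules (the class names are mine, purely illustrative):

    /* Flexbox: an explicit zero flex-basis plus a zero min size lets the
       algorithm skip the intrinsic (content-based) measuring pass */
    .row > .cell {
      flex-grow: 1;
      flex-basis: 0;
      min-width: 0;
    }

    /* Grid: a bare 1fr behaves like minmax(auto, 1fr), and the auto minimum
       forces a content measurement; an explicit 0 minimum avoids it */
    .grid {
      display: grid;
      grid-template-columns: minmax(0, 1fr) minmax(0, 1fr) minmax(0, 1fr);
    }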
(I also have a proposal for a new unit that would make it easier to get this performance by default, but I haven't managed to get any traction from the standards people or mainstream browsers yet - I probably need to implement it and write it up first.)
> Do you still google "css grid cheatsheet" three times a week like the rest of us?
Actually, no. The process of reading the spec umpteen times, because your implementation still doesn't pass the tests after the first N times, really ingrains the precise meanings of the properties into your brain.
To me, this is like asking whether having built a SQL engine changes how you write SQL.
I think the answer is yes: having a strong understanding of the underlying engine helps you debug and optimize more quickly.
The need for reference material never goes away unless something is used very frequently.
> breaking down a web engine into independently usable modules with public APIs
I love that. Years ago I looked at WebRTC and asked myself, why does it seem like making a crummy Skype knockoff is either 50 lines of JavaScript or a nightmarish week of trying to get half of Chromium to compile in your C++ project?
I think there is a WebRTC Rust crate finally, which is good. Web apps shouldn't be the only beneficiaries of all this investment.
Do you not worry that instead it will lead to feature bloat and fragmentation? If you’re going to strike at Google, you need laser focus.
Not really. Everything has a spec, so it's pretty clear what's in scope and what isn't. Also, we don't have time for bloat with our team size. We have to be laser focussed on what actually makes a difference to real websites. It's Chrome that has bloat!
Also the specs don't change much (there are additions, but almost always backwards-compatible), so once written the code shouldn't need too much maintenance.
I see it as writing a piece of foundational infrastructure. You can write your own HTTP library. But you don't need to: there are high-quality libraries readily available (e.g. curl or hyper). Similarly for HTML parsers or 2D graphics rendering. Nobody's done that for web layout yet (except arguably React Native's Yoga [0], but that's much less ambitious in terms of how much of the spec it attempts to support), so everybody has to write their own. But it's possible so we're doing it.
[0]: https://github.com/facebook/yoga/
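To make the "usable standalone" point concrete, here's a rough sketch of driving Taffy directly as a layout library, going from my reading of its docs (taffy 0.4-era API; type and helper names may differ between versions, so treat the details as assumptions rather than gospel):

    use taffy::prelude::*;

    fn main() -> Result<(), taffy::TaffyError> {
        let mut tree: TaffyTree<()> = TaffyTree::new();

        // A child that grows to fill its container, like `flex-grow: 1` in CSS.
        let child = tree.new_leaf(Style {
            flex_grow: 1.0,
            ..Default::default()
        })?;

        // A fixed-size flex container as the root node.
        let root = tree.new_with_children(
            Style {
                display: Display::Flex,
                size: Size { width: length(800.0), height: length(600.0) },
                ..Default::default()
            },
            &[child],
        )?;

        // Run layout, then read back the computed rectangle for the child.
        tree.compute_layout(root, Size::MAX_CONTENT)?;
        println!("{:?}", tree.layout(child)?);
        Ok(())
    }

The point being: no browser required, just a node tree, styles, and a compute call.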
Shadow DOM and CSS Grid don't strike me as bloat or fragmentation in any way? CSS Grid in particular is table stakes that would be part of any laser focus.
That's so fun to hear. I've been using Taffy for laying out my little rusty eink calendar.
I feel like Mozilla is going to join the annals of history with the likes of Xerox in the category of "Companies that created the technology of the future and casually tossed it to the wayside for competitors to scoop up" with Rust and Servo.
It's mind-boggling that for a company so often seemingly playing catch-up with Google, Mozilla actually leapfrogged Google in the browser development space for a time, and then...decided it wasn't worth pursuing any further.
Mozilla is nothing like Xerox, if anything Google is the new Xerox: they have way too much money so they throw it at R&D side projects without a business plan.
The big one for Google is transformer models: they basically invented LLMs, only to play catch-up with OpenAI later.
Mozilla successes have always been focused on the web browser, from the very beginning. Even the name reflects that: "Mozilla" stands for "Mosaic killer", Mosaic was the leading web browser at the time. They beat Mosaic with Netscape, they beat IE with Firefox, beating Chrome was their mission, and Rust and Servo were their weapons. It is sad that they dropped the ball.
Like Xerox (and Google), Mozilla tried doing some side projects, but unlike Xerox, they didn't have money to burn from their quasi-monopolies, and I can't think of anything particularly innovative coming from Mozilla that isn't a browser. I don't consider Rust to be a side project, it is a programming language for writing a browser, that it is useful for projects other than a web browser is a happy side effect.
> […] and I can't think of anything particularly innovative coming from Mozilla that isn't a browser. I don't consider Rust to be a side project, it is a programming language for writing a browser, that it is useful for projects other than a web browser is a happy side effect.
I’m sure Rust started out as something intended to help with their browser work. But it became a general-purpose programming language pretty early, right? I think it is… working pretty hard to find a reason not to include Rust as an innovative, non-browser piece of tech.
Anyway, I don’t really think it detracts from your broader point to count Rust as a separate thing from the browser.
> It's mind-boggling that for a company so often seemingly playing catch-up with Google, Mozilla actually leapfrogged Google in the browser development space for a time, and then...decided it wasn't worth pursuing any further.
Google has been Mozilla's main source of revenue since around 2006. For Mozilla to exist, all they have to do is keep Google happy.
It's kind of a nice deal for Mozilla, despite being a huge conflict of interest.
And it's an even better deal for Google. "But look, Chrome is not a monopoly."
If I were Google, I'd do something to set Mozilla back on track. But oh well, Google these days is even more dysfunctional. They're about to lose search.
Mozilla is over. They put all their eggs in the Google basket, and soon they’ll lose that. They have no viable path forward.
Servo and Ladybird are the future. I’m astounded by how quickly Ladybird is proceeding, with far fewer people than Mozilla. It’s been inspiring to see what that project is doing.
If Mozilla decides to dump Gecko, then it's time for a hard fork and an abandonment of Mozilla. Maybe it's time now.
edit: I mean dumping Gecko for Chromium, not Servo.
Honestly, dumping Gecko would be suicide for Firefox. Much of Firefox's minuscule userbase consists of knowledgeable nerds who are holdouts from the Blink monoculture.
It is difficult to get people to pay for it. People happily pay €10 for a beer but asked "friends" how to bypass WhatsApp's €0.99 lifetime license.
I've heard the same thing about monetising search engines. Kagi didn't get the memo, and last I heard they're turning a profit.
Sure, only nerds would pay for it, but not all products have to capture 100% market share.
And it's impossible to get people to pay for it when you give them no way to pay for it.
Some did. Some like me lined up eagerly to pay the $1 yearly license and would have been happy to pay for my family too.
> People happily pay for 10€ beer
I would not be happy to pay that much for a beer
Servo would not have been some dramatic revolution in browser technology. It would have done the same thing browser engines currently do, but maybe a little faster and with a couple less security issues per year.
It's dishonest to compare that to "the technology of the future", especially when nobody chooses their browser based on whether a page loads in 82ms or 70ms. They mostly choose browsers based on familiarity and marketing and what's preinstalled. Chrome is fast enough for most people that they're not gonna switch just because their dweeb nephew mentioned some other browser might be marginally faster.
And they didn't toss Rust aside either. New bits of Firefox continue to be written in Rust to this day. That was actually the primary reason Servo was dropped - the bits of the browser engine that could be replaced with Servo bits easily had already been replaced. Rewriting Gecko slowly was deemed more practical than committing to 5+ years of parallel development and hoping that by the end it would be possible to replace millions of lines of Gecko code in one fell swoop.
This is only fair.
Mozilla doesn't deserve to survive. New players deserve our support, like Servo and Ladybird.
Even with an enormous budget from Google (around $500M per year, I think), they managed to ruin everything, including Firefox, the very thing bringing them that $500M.
To me it looks as if Baker was an undercover person put there to sabotage Mozilla. TL;DR: funded by Google, she did absolutely everything in her power to run it into the ground.
Unlike historical examples like Stephen Elop, who moved from Microsoft to Nokia and buried its mobile division only to return to Microsoft, Mitchell Baker was with Mozilla from the start.
I'm not sure about that. Baker was one of the first Netscape employees, she literally helped found the Mozilla foundation, and she served as the first president of it.
I'm not saying she has done a good job, but a lot of the early Netscape people like Brendan Eich have done nothing but sing her praises.
Yup, wouldn't be the first one [0].
[0] https://en.m.wikipedia.org/wiki/Stephen_Elop
It's still baffling to me that Mozilla threw out Firefox's technical future.
Very little about Mozilla makes sense --- until you follow the money.
The end of Pocket is just another sad example.
As an April fools joke, Mozilla should announce that they are discontinuing Firefox in order to focus on their core business, which is a beautiful abstraction: the Platonic ideal of discontinuing popular products.
Strange. I remember reading nothing but complaints about Pocket when they bought and integrated it. I guess it grew on people.
That they're being bought by Google so that they can focus on the Platonic ideal of discontinuing popular products.
The string of departures and how people communicated around them leads me to think it's something a lot less nefarious: Mozilla looked like an extremely political company at the time.
Servo was extremely good at communication with its very frequent newsletter, and Rust had a lot of wind in its sails; I wouldn't be surprised if that ruffled the wrong feathers. Mozilla is very much still managed by what remains of its old guard - by that I mean what hasn't been poached - especially at the top.
That would be pure incompetence from top management, but not malice.
A lot of things look like incompetence at a distance, but get really fascinating up close. Does anyone care to share what they know about the particular personalities, drives, and visions?
That would be pure incompetence from the top
For which they are compensated very well.
It just doesn't make sense, does it?
The way this world is going, you shouldn't attribute to incompetence what can be attributed to malice.
I still strongly believe Servo can be a real counterpoint to Chrome/Chromium's hegemony in the long haul. Not sure why Mozilla ditched it nor why The Linux Foundation gives little to no support at all to it.
> nor why The Linux Foundation gives little to no support at all to it
The Linux Foundation is mostly a dumping ground for dead and dying projects. Particularly they seem to specialize in abandoned commercial open source projects.
I don't think the Foundation provides much, if any, developer funding for these projects. They list $193M in "project support" expenses but host over 1,000 projects.
WebKit is a nice competitor, too. Look at the Orion browser; it's pretty decent. Although Orion only targets macOS, WebKit can be used on Windows and Linux, too.
Because Mozilla benefits from Google's money (the majority of its funding comes from Google), and being a counterpoint to Google's Chrome is bad for Google, which means less money, or none, for Mozilla. Google holds the key here. They have leverage over Mozilla.
I think Mozilla makes a lot of sense if you consider the following long-term strategic goal: become independent of Google money. Non-Google income has grown to $150M in 2023, up from $80M the year before. Mozilla has used dramatically more of the Google money to build up assets than it spends on advocacy or the other projects that irk some people so much. In 2023 they had $1B in investments. Net assets have been going up by $100M+ per year.
They are not yet in striking distance to truly become independent, but they are making significant steps in that direction. The share of Google money in their revenue went from 90% in 2020 to 75% in 2023.
I don't think following the money actually shows what you think it does.
As a postscript:
Damned if they do, damned if they don't. There were plenty of people at the time arguing that Firefox maintaining one independent browser engine was idiotic and they should just switch to Chromium like everyone else. People like to lambast Mozilla over relatively minor advocacy spending stuff and cry that it should just focus on Firefox, but insist it should have obviously continued with Servo. Even though Servo probably wouldn't have made a substantial difference to Firefox post-Quantum for a very long time.
At what point could Firefox have just invested the money from Google into the S&P 500 and then run the company off the passive income?
For $150M I bet you could fund browser development for at least a decade, and that was just one year of income (though of course that burns through the entire $150M).
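Back-of-envelope, using the thread's own figures and assuming a conventional 4% endowment withdrawal rate (the 4% is my assumption, not from the thread):

    endowment needed ≈ annual spend / withdrawal rate
                     = $150M / 0.04
                     ≈ $3.75B

At the roughly $500M/year of Google money mentioned elsewhere in the thread, that's seven to eight years of banking the deal revenue, fewer with market returns along the way.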
The majority of Mozilla’s revenue came through Firefox—their flagship product and by far their most recognized project.
And yet, somehow, they still struggle to secure adequate funding for Firefox itself, while millions are allocated to executive salaries and various advocacy initiatives.
What do you mean? Doesn't most of Mozilla's revenue come from Firefox?
I think people are implying that Google told Mozilla to drop Servo, to make sure Firefox wouldn't leapfrog Chrome. And since Google funds Mozilla almost entirely, Mozilla had to comply.
Almost no technical aspect of Firefox has anything to do with how much money Google pays them.
Mozilla is basically paid by Google to have a multi-platform non-Blink browser around they can point to when accused of being a monopoly.
Having a quality browser is not required; it merely needs to exist. So why waste money on novel web engine experiments when you can have AI conferences in Zambia?
How do you think Firefox is making money, since it has no paid features? Hint: it has Google search as the default search engine.
Nope, it comes from Google.
Every day I log into Hacker News. Every day, I see Mozilla FUD.
It's wrong to name names without proof, but after a point you start doubting your understanding of the world, and at this point I'm beginning to suspect Mitchell Baker is a plant.
I think without Mitchell Baker there would probably not have been a Mozilla. I'm fuzzy on the history but I believe she was the lawyer who originally set up the organization.
While it's hard to know what will come of it, there is also https://ladybird.org/ to challenge the monopoly of Blink.
I just don't get the point of Ladybird.
They have full-time engineers and are soliciting donations, so it's clearly more than a hobby project.
Maybe my assumptions are off, but I just can't imagine they could ever become competitive in terms of features, security, and performance with the big engines. Blink is setting the pace, WebKit is barely able to keep up, and Gecko is slowly falling behind. All of these teams are orders of magnitude larger than the Ladybird team.
If you think that Blink's dominance is a threat to the web, it's not enough to have an alternative engine; you need enough adoption of that engine that web devs make sure their sites are compatible with it.
Most of this also applies to Servo, but at least their technical project goals (embeddable, modular, parallel, memory safe) sound moderately compelling. Maybe Ladybird has similar goals, but their website doesn't really state any technical goals.
It is donation-funded with no reliance on outside parties. They don't have to inject ads into pages like Brave did, or sell out to Google, compromising their independence on web standards.
They're ahead of Servo already anyway, and better funded.
Not being funded by Google money is a pretty big deal. Some of the developers are former WebKit devs, so they have a good foundation to start from. It remains to be seen if they can pull it off.
Orion adding Windows support (getting WebKit running on Windows again) would be pretty good too.
Ladybird has better results in web rendering tests than Servo, and is slowly gaining on Firefox.
They are already quite competitive.
Larger teams do not necessarily mean you get stuff done faster. If anything, past some point a large team can find it hard to get things moving, and has tons of issues with communication.
Well, Andreas Kling has worked on Safari and WebKit and (obviously) has talked to a lot of browser people. He knows what he is doing, and he frequently says that no one who has actually worked on a browser thinks it's impossible to create a new one, even with a small team (...of highly motivated and skilled people).
I think there are two things to keep in mind.
1) Apple and Firefox have enough resources to implement the most recent web standards. When you see a feature which goes un-implemented for too long, it’s almost surely because nobody was even working on it because of internal resourcing fights.
2) Devs aren’t created equal. It’s possible for a team of 8 people to be 10x more productive than another team of 8.
What's wrong with WebKit? It's super fast. I tested the Orion browser recently.
If anything, Ladybird is an independent implementation of the web standards, and the devs have identified and helped solve quite a few bugs and ambiguities in the standards, which benefits everyone, from browser devs (including the big guns) to web developers and users.
I think the browser attempts are all wrong in trying to "cover all edge cases"; they should focus on being able to transform any and all dark patterns into something simpler.
Hell, start out by extracting text, pictures, and video from all the nightmare sites, then slowly add whatever features don't actually lead to dark patterns.
There was a similar sentiment about 0xide.
I have to be honest that I don't really understand the appeal of Ladybird from a purely technical perspective. It is written in C++, just like all the existing engines (yes there is some Swift, but it is negligible), so what benefit does it provide over Gecko or Blink? With Servo, I can see there is a distinct technical design around security and parallelism.
There is more to a project this massive than its choice of language. For me, though, it's mostly about breaking from the monopoly and putting a check on Google's influence over the browser space.
Many, many factors to consider. Simplistic take: KHTML was picked up by Apple because of its clean design and codebase; there's an extra 30 years of accumulated improvements in C++; and you don't write stuff in Rust 1.0 either.
Also: Andreas has worked on WebKit/Blink. He knew what he was doing when he started the project, even if it was "just for fun". Linux started under similar circumstances.
Rust does not produce magic in the assembly code. It does not even produce faster assembly code. The Rust toolchain on its own does not even produce assembly code; it just passes that job to LLVM, which does THE ENTIRE optimization. Without LLVM (written in C++) doing the heavy lifting, Rust would probably be slower than V8 (written in C++) running JavaScript. There's no technical marvel in Servo compared to Ladybird. I don't understand the yapping about how a language makes projects good or bad, despite it being proven completely false again and again. The appeal is in the independence and politics around the project.
I hope it succeeds. Now they allow direct donations, so people who want it to succeed can help. I am sure these donations go directly into the development of the browser, unlike with Mozilla.
Worth pointing out that you can also sponsor Servo development: https://github.com/sponsors/servo
Looks interesting, but they're going to try writing it in Swift?
If you're writing a browser engine in C++, I may not like it, but I can see that you're pragmatic and are focused on the end result rather than the language. If you're writing it in Rust, okay, you maybe have your eyes on that pie in the sky, but you've chosen a language that, at least potentially, has the ability to replace C++ in creating bedrock software.
Any other language and I feel like someone with a lot of political capital at the company just has a personal preference for the language and so, "yeah, we're going to rewrite it all in Swift"[0].
I mean, you're writing a browser. Do you really want to build it in a language that is at the "it's improving" stage of support for the most popular operating systems?
[0]: https://x.com/awesomekling/status/1822236888188498031
You should look into why they chose it and what their implementation plan is. They evaluated multiple languages, including Rust. There were some specific issues with Rust that made it unsuitable for them. Swift was a bit of a dark horse candidate that the developers ended up liking.
There is no immediate plan to switch to anything so it’s still C++. They may not ever switch. Swift’s cross platform support isn’t there yet and that’s a prerequisite.
> The Dogemania test ran at a smooth 60 FPS on my M4 Pro MacBook Pro until reaching around 400 images
I ran Dogemania on Chrome up to 1400 images at a steady 60 FPS, at which point I got bored and closed the tab.
Wow, just did this myself. The difference between Firefox and Chromium was depressing.
I'm getting >100 FPS almost consistently all the way up to 1,000 in Chromium; FF barely made it above 500 before dropping below 60.
Probably not a completely fair test (I've no extensions etc. on Chromium and only use it for the odd stubborn website), but that was quite a lesson in their comparative performance.
For me it was a stable 120fps on Firefox at 2200 and 60fps at 3200, drawing 200W on my RX 6750 XT.
Chromium dropped below 60fps at around 4000 and below 30fps at 6000, but used my integrated Intel GPU the whole time.
Chrome has thousands of developers working on it. That is a lot of man-hours for small optimizations.
If Dogemania keeps the images still after they finish animating, isn't it embarrassingly optimizable? Why can't an Amiga do an infinity of these?
It did! Infinite bobs.
Describing Servo as "new" is a stretch ;)
It started later than the other engines and it seems to have a number of great ideas the other engines haven't adopted yet.
I thought it was more like: Rust was made for the web browser Servo.
> This is a danger to the open web in more ways than one. If there is only one functioning implementation of a standard, the implementation becomes the standard.
I still don't understand why this is a problem, as long as the engine implementing the spec (governed by a committee formed by entities other than Google itself) is open source. The problem, and the waste of resources, is how we are behaving now.
The browser engine should become like the Linux kernel: one engine and different distros.
Even with the best intentions, the implementation is going to have bugs and quirks that weren't meant to be the standard.
When there's no second implementation to compare against, then everything "works". The implementation becomes the spec.
This may seem wonderful at first, but in the long run it makes pages accidentally depend on the bugs, and the bugs become a part of the spec.
This is why Microsoft has a dozen different button styles, and sediment layers of control panels all the way back to 1990. Eventually every bug became a feature, and they can't touch old code, only pile up new stuff around it.
When you have multiple independent implementations, it's very unlikely that all of them will have the same exact bug. The spec is the subset that most implementations agree on, and that's much easier to maintain long term, plus you have a proof that the spec can be reimplemented.
Bug-compatibility very often exposes unintended implementation details, and makes it hard even for the same browser to optimize its own code in the future (e.g. if pages rely on order of items you had in some hashmap, now you can't change the hashmap, can't change the hash function, can't store items in a different data structure without at least maintaining the old hashmap at the same time).
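(To make that hashmap example concrete, here is a minimal sketch, mine rather than the commenter's, using Rust's standard HashMap, which deliberately seeds its hasher randomly per process precisely so that nothing can come to depend on iteration order.)

```rust
use std::collections::HashMap;

// A sketch of the "pages rely on your hashmap order" trap: std's
// HashMap uses a randomly seeded hasher, so iteration order can
// differ from one run to the next.
fn main() {
    let mut m = HashMap::new();
    m.insert("a", 1);
    m.insert("b", 2);
    m.insert("c", 3);

    // This order is unspecified and may change between runs. An engine
    // that promised a fixed order here (bug-compatibility) could never
    // again swap its hash function or underlying data structure.
    for (k, v) in &m {
        println!("{k} = {v}");
    }
}
```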
Is that so bad, though? It's essentially what's already the case, and as you said, the developers already have an incentive to avoid making such bugs. Most developers are only going to target a single browser engine anyway, so, bug or not, any divergence can cause end users problems.
I disagree. The more browser engines in use, the less damage any one security exploit can do. This matters even with memory-safe languages, because logic errors can be just as damaging, e.g. the Log4j exploit.
Bugs are proportional to lines of code, so more browser implementations will result in many more bugs. If security is what you are after, all the effort of reimplementing multiple times would be better put towards securing a single browser engine. Also, a single exploit isn't enough on its own: thanks to the defense in depth of browser engines, you have to chain multiple exploits together.
In theory yes. In practice, only when the interests of the sole maintainer are aligned with the interests of the users; since these can change, it’s best to avoid a monopoly.
Case in point: the recent Manifest V2 deprecation is generally disliked by the community, and no long-term Blink-based alternative exists (that I know of).
> is generally disliked by the community
It is disliked by a vocal minority, and of that minority even fewer actually have their own opinion based on the current version of MV3, as opposed to mindlessly parroting others. If it were a true issue, then others would be maintaining MV2 support long term. When it comes to monopolies and control, what matters is the market share of the browser as a product, not that of the browser engine.
> The browser engine should become like the Linux kernel: one engine and different distros.
Try spending a month with a BSD - personally, I recommend OpenBSD, or (yes) macOS. Alternatively, try ZFS where you've previously used mdadm+lvm+luks+etc.
The difference is like sitting in a chair that has all of its screws tightened. You only notice how bad your previous chair was once you feel there's no wobbling.
That's more of a distro thing than "Linux having a few screws loose".
I vastly prefer the OpenBSD kernel too, but if you use a distro that does things properly such as NixOS, the BSDs are the ones that feel wonky.
Google is killing Manifest V2, and AFAIK all downstream Chromium-based browsers (Brave, Edge, Vivaldi, Opera, etc) will eventually be affected. That should say enough about why having multiple browser engines is a good thing.
Standards already skew heavily to what Google or Google connected individuals want. If everything was Chromium based the situation would be even worse.
Maybe worse is what we need, to realize that many of the W3C and WHATWG standards are past the point of saving, and that those organisations are no longer a good avenue for further advancement of the web.
I'm hearing Servo got rebooted because Valve gave money to Igalia to restart the project. Can anyone confirm this?
It's really sad to see what Mozilla turned into: from a competitive browser company to activism. No wonder its core product started to wane.
"Most sites have at least a few rendering bugs, and a few are completely broken. Google search results have many overlapping elements, and the MacRumors home page crashed after some scrolling. Sites like Wikipedia, CNN Lite, my personal site, and text-only NPR worked perfectly."
Like many HN readers, I have read countless accounts of web browsers and web browsing over the years.
Unfortunately, I cannot recall even one that took an account such as this one and concluded something like, "We need to modify the Google and MacRumors pages so they work with Servo." Instead, the conclusion is usually something like, "We need to fix Servo so it works like Chromium/Chrome."
The reason I believe this is unfortunate is that (a) it ultimately places control with an ad services company and (b) it creates the wrong incentives for people who create web pages. Pages could be modified to conform to what Wikipedia, CNN Lite, the author's personal site, and text-only NPR have done. This is not difficult. In fact, it is easier than modifying Servo to do what Chromium/Chrome is doing.
IMO, the "standard" web browser should not be effectively defined by an ad services company (including its business partner, Mozilla), nor should the standard for a web page be defined by the "most popular" web browser. To me, "popular" and "standard" are not necessarily the same. Web _pages_ (cf. web _browsers_) should work with unpopular browsers and popular browsers alike. According to OP, Wikipedia, CNN Lite, the author's personal site, and text-only NPR may meet the standard.
In sum, fix web pages not web browsers.
As a hobbyist, I still compile and experiment with W3C's original libwww library and utilities. Below is a short script I use to compile static binaries. With a TLS forward proxy, these utilities, with few modifications, if any, can still work very well for me for retrieving web pages on today's web. (I am only interested in learning www history and optimising text retrieval, not graphics.) This library is "ancient" on the www timescale and yet it still works 30 years later. That's useful for www users like me, but maybe not for the online ad services companies and sponsored web browsers optimised for data collection and surveillance. The Internet is supposed to be a public resource, not a private one, i.e., the highest priority of www pages and www browsers should be to serve www users, not online ad service providers.
(Back slashes were accidentally omitted)
All-time opener
I'm afraid Servo is going to be another Hurd.
I feel GUIs have taken a wrong turn. We need a simple, composable language. I want entire theorems built from the smallest possible set of axioms.
HTML and CSS are not it. The sheer complexity is what causes all the BS.
Show us the way!
Form a design team to reduce the number of primitives in HTML, such that I can do everything that HTML + CSS does but with the minimum number of primitives.
Projects get Rust advertised in the title because the Rust community is somewhat on a quest to assert itself. There are certain things the language will only be allowed to do if it becomes seen as one of the "big" languages, and advertising the language in a million small projects is a collective shove toward that aim.
The terms you're looking for are: vocal minority and bubble.
Please bring this to iOS. WebKit is broken.
If you think WebKit is broken, you should actually download a build of Servo and try it out.
> the MacRumors home page crashed after some scrolling
I know there's a ton going on, with GPUs and rendering and all kinds of things, but because Rust's memory safety and "no null pointers!" are so constantly hyped (especially in conversations about Go), I'm always surprised when you fire up a Rust app, do something, and it crashes out…
[To be clear, I’m a big fan of modern sum types, and like to imagine an alternate reality where Go had them from the start…]
When they say "it crashed" they probably mean it panicked or just failed to render; not a segfault.
Most languages have a way of saying "I haven't handled this case yet; just exit the program if you get here".
Rust also doesn't consider DoS (denial of service) a failure condition.
Reading an invalid index? That's a panic.
It can also get OOM-killed.
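(A minimal sketch of the distinction these last few comments are drawing, written for this note rather than taken from the thread: in safe Rust an out-of-bounds index is a deterministic, catchable panic, i.e. a controlled abort rather than memory corruption.)

```rust
// Safe Rust "crashes" by panicking: out-of-bounds indexing aborts in a
// controlled, deterministic way instead of reading arbitrary memory.
fn main() {
    let v = vec![1, 2, 3];

    // The panic unwinds and can be caught at a boundary, which is how
    // "it crashed" in Rust usually differs from a segfault.
    let result = std::panic::catch_unwind(|| v[10]);
    assert!(result.is_err());
    println!("out-of-bounds access was a panic, not undefined behavior");

    // `todo!()` (or `unimplemented!()`) is the idiomatic "I haven't
    // handled this case yet; just exit if you get here" escape hatch
    // mentioned above; reaching it also panics.
}
```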