Our developers once managed to pull around 750MB of data per page load.
They put in a ticket with ops saying the server was slow and could we look at it. So we looked. Every single video on a page with a long video list pre-loaded part of itself. The only reason the site didn't run like shit for them is that the office had direct fiber to our datacenter a few blocks away.
We really shouldn't allow web developers more than 128kbit/s of connection speed; give them anything more and they just make nonsense of it.
PSA for those who aren’t aware: Chromium/Firefox-based browsers have a Network tab in the developer tools where you can dial down your bandwidth to simulate a slower 3G or 4G connection.
Combined with CPU throttling, it's a decent sanity check to see how well your site will perform on more modest setups.
I once spent around an hour optimizing a feature because it felt slow - turns out the slower simulated connection had just stayed enabled after a restart (can’t remember if it was the browser or the OS; I’d needed it earlier and then forgot to turn it off). Good times. Useful feature, though!
I still test mine on GPRS, because my website should work fine in the Berlin U-Bahn. I also spent a lot of time working from hotels and busses with bad internet, so I care about that stuff.
Developers really ought to test such things better.
For macOS users you can download the Network Link Conditioner preference pane (it still works in the System Settings app) to do this system wide. I think it's in the "Additional Tools for Xcode" download.
I had a fairly large supplier that was very proud of having implemented functionality that deliberately (in their JS) delays reactions to HTTP responses, so that they could showcase all the UI touches like progress bars and spinning circles. It was an option in system settings you could turn on globally.
My mind was blown, are they not aware of F12 in any major browser? They were not, it seems. After I quietly asked about that, they removed the whole thing equally quietly and never spoke of it again. It's still in release notes, though.
It was like 2 years ago, and browsers had been able to do that for 10-14 years by then (depending how you count).
For Firefox users, here's where it's hidden (and it really is hidden): Hamburger menu -> More tools -> Web developer tools, then keep clicking on the ">>" until the Network tab appears, then scroll over on about the third menu bar down until you see "No throttling", that's a combobox that lets you set the speed you want.
Alternatively, run uBlock Origin and NoScript and you probably won't need it.
Peanuts! My wife’s workplace has an internal photo gallery page. If your device can cope with it and you wait long enough, it’ll load about 14GB of images (so far). In practice, it will crawl along badly and eventually just crash your browser (or more), especially if you’re on a phone.
The single-line change of adding loading=lazy to the <img> elements wouldn’t fix everything, but it would make the page at least basically usable.
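For the curious, that change really is one attribute; a sketch (the filename is made up):

```html
<!-- Before: the browser fetches every image as soon as the page loads -->
<img src="gallery-1042.jpg" alt="Company event photo">

<!-- After: loading=lazy defers the fetch until the image nears the viewport.
     Explicit width/height also prevent layout shift while images trickle in. -->
<img src="gallery-1042.jpg" alt="Company event photo" loading="lazy" width="800" height="600">
```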
Reserve a huge share of the blame for the “UX dEsIgNeRs”. They demand we reimplement every single standard widget in a way that has 50% odds of being accessible, is buggy, breaks autofill most of the time, and adds 600kB of code per widget. Our precious branding requires it.
Often we're told to add Google XSS-as-a-serv.. I mean Tag Manager, and then the non-tech people in Marketing go ham without a care in the world beyond their metrics. Can't blame them; it's what they're measured on.
Marketing and managers should be restricted as well, because managers set the priorities.
You can still make a site unusable without having it load lots of data. Go to https://bunnings.com.au on a phone and try looking up an item. It's actually faster to walk around the store and find an employee and get them to look it up on an in-store terminal than it is to use their web site to find something. A quick visit to profiles.firefox.com indicates it's probably more memory than CPU, half a gigabyte of memory consumed if I'm interpreting the graphical bling correctly.
You don't even need video for this: I once worked for a company that put up a carousel with everything in the product line, and every element pointed straight at the high-resolution photography assets: the ones that might be useful for full-page print media ads. 6000x4000 PNGs. It worked fine in the office, they said. Add another nice background that size, a few more to have on the sides as you scroll down...
I was asked to look at the site when it was already live, and some VP of the parent company decided to visit the site from their phone at home.
Many web application frameworks already have extensive built-in optimization features, though examples like the one you shared show there are fundamentals that many people contributing to the modern web simply don't grasp, and that these frameworks won't catch for you in many cases. It speaks to an overreliance on the tools and a critical lack of understanding of the technologies they coexist with.
Music producers often have some shitty speakers known as grot boxes that they use to make sure their mix will sound as good as it can on consumer audio, not just on their extremely expensive studio monitors. Chromebooks are perfectly analogous. As a side note, today I learned that Grotbox is now an actual brand: https://grotbox.com
We should also periodically give designers small displays with low maximum contrast, and have them actually try to accomplish everyday tasks with the UX they have designed.
If you want to see context aware pre-fetching done right go to mcmaster.com ...
There are good reasons to have a small cheap development staging server, as the rate-limited connection implicitly trains people about what not to include. =3
Well, as long as the website is already fully loaded and responsive, and the videos show a thumbnail/placeholder, you are not blocked by that. Preloading, even very aggressive pre-loading, is a thing nowadays. It is hostile to the user (because it burns network traffic they pay for), but project managers will often override that to maximize gains from ad revenue.
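For what it's worth, the non-hostile version of this is built into the `<video>` element; a sketch with placeholder filenames:

```html
<!-- preload="none": fetch no video data up front; the poster image
     stands in until the user presses play. preload="metadata" is the
     middle ground (duration/dimensions only). -->
<video controls preload="none" poster="thumb.jpg" width="640" height="360">
  <source src="clip.mp4" type="video/mp4">
</video>
```

Note that `preload` is only a hint; browsers are free to ignore it.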
This is a general problem with lots of development: network, memory, GPU speed. The designer/engineer is on a modern Mac with 16-64 gigs of RAM and fast internet. They never try how their code/design works on some low-end Intel UHD 630 or whatever. Lots of developers make 8-13 layer blob backgrounds that run at 60 or 120fps on their modern Mac but at 5-10fps on the average person's PC because of 15x overdraw.
I used the text web (https://text.npr.org and the like) through Lynx. Also Usenet, Gopher, Gemini, some 16 kbps Opus streams, everything under 2.7 kbps when my phone data plan was throttled and I was using it in tethering mode. Tons of sites did work, but gopher://magical.fish ran really fast.
Bitlbee saved (and still saves) my ass, with tons of protocols available via IRC using nearly nil data to connect. And you can connect with any IRC client from the early '90s onward.
Not just web developers. Electron lovers should be throttled with 2GB-of-RAM machines and some older Celeron/Core Duo box with a GL 2.1-compatible video card. If the desktop 'app' is smooth on that machine, your project is ready.
I'm pretty damn sure those videos were put on the page because someone in marketing wanted them. I'm pretty sure then QA complained the videos loaded too slowly, so the preloading was added. Then, the upper management responsible for the mess shrugged their shoulders and let it ship.
You're not insightful for noticing a website is dog slow or that there is a ton of data being served (almost none of which is actually the code). Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.
From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.
> Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.
If a bridge engineer is asked to build a bridge that would collapse under its own weight, they will refuse. Why should it be different for software engineers?
And the devs are responsible for finding a good technical solution under these constraints. If they can't, for communicating their constraints to the rest of the team so a better tradeoff can be found.
this isn't purely laundering blame. what is frustrating for the infrastructure/operations side is that dev teams routinely kick the can down to them instead of documenting the performance/reliability weak points. in this case, when someone complains about the performance of the site, both dev and qa should have documented artifacts that explain this potential. as an infrastructure and reliability person, i am happy to support this effort with my own analysis. i am less inclined to support the dev team that just says, "hey, i delivered what they asked for, it's up to you to make it functional."
> From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.
this belittles the intelligence of the dev team. they should know better. it's like validating someone saying "i really thought i could pour vodka in the fuel tank of this porsche and everything would function correctly. must be porsche's fault."
Fuck that. I just left a job where the IT dept just said "yes and" to the executives for 30 years. It was the most fucked environment I've ever seen, and that's saying a lot coming from the MSP space. Professionals get hired to do these things so they can say "No, that's a terrible idea" when people with no knowledge of the domain make requests. Your attitude is super toxic.
The devs are the subject matter experts. Does marketing understand the consequences of preloading all those videos? Does upper management? Unlikely. It’s the experts’ job to educate them. That’s part of the job as much as writing code is.
From the perspective of the devs, they have a responsibility to say when something literally won't fly anywhere, ever. Saying the business is responsible for every bad decision is a complete abrogation of your responsibilities.
Just FYI how this generally works: it's not developers who add it, but non-technical people.
Developers only add a single `<script>` in the page, which loads Google Tag Manager, or similar monstrosity, at the request of someone high up in the company. Initially it loads ~nothing, so it's fine.
Over time, non-technical people slap as many advertising "partner" scripts as they can into the config of GTM, straight to prod, without telling developers and without thinking twice about the impact on loading times etc. All they track is $ earned on ads.
(It's sneaky because those scripts load async in the background, so the website doesn't immediately feel slower / more bloated. And of course, on a high-end laptop the website feels "fine" compared to a cheap Android. Also, there's nothing developers can do about those requests; they're under the full control of those 3rd parties.)
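For context, the developer-visible footprint really is tiny. A generic sketch of what such a tag-manager loader looks like (the URL and container ID are placeholders, not the actual GTM snippet):

```html
<script>
  // Inject the third-party loader asynchronously so it doesn't block
  // rendering. Everything it pulls in afterwards is decided by the
  // tag-manager config on the vendor's servers, not by this codebase.
  (function () {
    var s = document.createElement('script');
    s.async = true;
    s.src = 'https://tags.example.com/loader.js?id=CONTAINER-123';
    document.head.appendChild(s);
  })();
</script>
```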
Fun fact: "performance" in the parlance of adtech people means "ad campaign performance", not "website loading speed". ("What do you mean, performance decreased when we added more tracking?")
I tried to fight against the introduction of GTM in a project I worked on; we spent a lot of effort coding, reviewing, testing, optimizing and minimizing client-side code before our end-users would see it, and the analytics people wanted a shortcut to inject any JS anywhere?
I didn't win that one, but I did make sure that it would only load after the user agreed to tracking cookies and the like.
Yeah, it’s really hard to compete with a solution that takes engineers out of the loop. The biggest reason large orgs go so crazy with GTM is that it’s a shadow deployment pipeline that doesn’t require waiting for engineers to work a request, or QA, or a standard release process.
And sure, better prioritization and cooperation with eng can make the “real” release processes work better for non-eng stakeholders, but “better” is never going to reach the level of “full autonomy to paste code to deploy via tag manager”.
This is the same reason why many big apps have a ton of Wordpress-managed pages throughout the product (not just marketing pages); often, that’s because the ownership and release process for the WP components is “edit a web UI” rather than “use git and run tests and have a test plan and schedule a PR into a release”.
Similar story here. I had to remind them multiple times that the website was not conforming with the law, and explain multiple times that the consent dialog was not implemented correctly, or point out that stuff was loaded before consent, etc. They mostly found it annoying, of course. And of course no one thanked me for saving the business from running into complications with the law. As far as I know, I was the only one there pointing out the issues, as a backend dev; even the frontend team was blissfully ignorant of them.
The good thing about the heavy use of GTM is that it's easy to block. Just block that one endpoint and you remove most of the advertising and tracking. When some new advertising service is invented, it's already blocked thanks to the blocking of GTM.
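A sketch of what that blocking looks like in practice; the first line is uBlock Origin's static filter syntax, the rest a hosts-file/Pi-hole equivalent:

```
! uBlock Origin: block the tag-manager loader everywhere
||googletagmanager.com^

# hosts file / Pi-hole equivalent
0.0.0.0 googletagmanager.com
0.0.0.0 www.googletagmanager.com
```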
Developers do that as well, especially now with LLM-assisted coding: accept a half-baked solution and go to the next ticket.
I recently had a case at work: while filling out a contact form to add a new party, there were 300+ calls to the validation service to validate emails and phone numbers. Three calls per character entered, for every text input!
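The standard fix for that is to debounce the handler so validation only fires once the user pauses typing. A minimal sketch; the scheduler arguments are injectable purely so the logic is testable, and in the browser the defaults (setTimeout/clearTimeout) are what you'd actually use:

```javascript
// Debounce input validation so typing doesn't fire a network call
// per keystroke: each new call cancels the previous pending one.
function debounce(fn, delayMs, schedule = setTimeout, cancel = clearTimeout) {
  let pending = null;
  return (...args) => {
    if (pending !== null) cancel(pending);          // drop the stale call
    pending = schedule(() => fn(...args), delayMs); // re-arm the timer
  };
}
```

Wiring `debounce(validateField, 300)` into the input's event listener collapses those 300+ requests into one per pause in typing.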
Yeah, never allow non-technical people to put something like Google Tag Manager, which can load arbitrary other stuff, on the business' website. The moment this is pushed through against engineering's advice, distancing yourself from the cesspool that the website will inevitably become is the healthy choice. It is difficult to hold the dam against the wishes of other departments like marketing and sales; it takes an informed and ethically aware engineering lead who upholds principles and remains steadfast. Rare.
GDPR compliance is the first thing that goes out of the window, and with it conformance to the law when in the EU. Ethics fly out of the window at the same time, or just slightly afterwards: when they add tracking that no one agreed to, or forget to ask for consent, or have a "consent" popup that employs dark patterns, or outsource consent to a third-party tool that informed visitors want nothing to do with.
Author here. Woke up today to see this on the front page, thank you to the person who submitted it! Initially, my biggest fear was the HN "Hug of Death" taking it down. Happily, Cloudflare's edge caching absorbed 19.24 GB of bandwidth in a few hours with a 98.5% cache-hit ratio, so the origin server barely noticed.
The discussions here about DNS-level blocking and Pi-hole are spot on. It's interesting that the burden of a clean reading experience is slowly being offloaded to the user's network stack.
Sure thing! I don't have the exact instantaneous peak since Cloudflare groups historical data by the hour on the free tier, but the peak 60 minutes last night saw 70,100 requests. That averages out to about 20 requests per second sustained over the hour. Wish I could be more granular but hope that helps a little.
I have done minor experiments with disabling JavaScript, and it works: most publications are far more readable with JavaScript disabled. You miss carousels and some interactive elements, but overall it's a much better experience.
These days the NYT is in a race to the bottom. I no longer even bother to bypass ads let alone read the news stories because of its page bloat and other annoyances. It's just not worth the effort.
Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites—those behind paywalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
We'll simply cut the headline from the offending website, paste it into a search engine, and find another site with the same or similar info but with easier access.
I no longer think about it, as by now my actions are automatic. Rarely do I find an important story that's limited to only one website; generally dozens have the story, and because of syndication the alternative site one selects often even has identical text and images.
My default browsing is with JavaScript defaulted to "off" and it's rare that I have to enable it (which I can do with just one click).
I never see ads on my Android phone or PC, and that includes YouTube. Disabling JavaScript on webpages nukes just about all ads; they just vanish, and any that slip through are trapped by other means. In short, ads are optional. (YouTube doesn't work sans JS, so just use NewPipe or PipePipe to bypass ads.)
Disabling JavaScript also makes pages blindingly fast as all that unnecessary crap isn't loaded. Also, sans JS it's much harder for websites to violate one's privacy and sell one's data.
Do I feel guilty about skimming off info in this manner? No, not the slightest bit. If these sites played fair then it'd be a different matter but they don't. As they act like sleazebags they deserve to be treated as such.
In the past some sites had light versions, but I haven’t come across one in over 10 years.
Makes me wonder if this isn’t just some rogue employee maintaining this without anyone else realizing it
It’s the light version, but ironically I would happily pay these ad networks $20 a month to just serve these lite pages and not track me. They don’t make anywhere close to that from me in a year.
Sadly, here’s how it would go: they’d do it, it’d be successful, they’d IPO, after a few years they’d need growth, they’d introduce a new tier with ads, and eventually you’d somehow wind up watching ads again.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites—those behind paywalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
They know this. They also know that web surfers like you would never actually buy a subscription and you have an ad blocker running to deny any revenue generation opportunities.
Visitors like you are a tiny minority who were never going to contribute revenue anyway. You’re doing them a very tiny favor by staying away instead of incrementally increasing their hosting bills.
>Visitors like you are a tiny minority who were never going to contribute revenue anyway.
It's closer to 30% that block ads. For subscription conversion, it's under 1%.
It's a large reason why the situation is so bad. But the internet is full of children, even grown children now in their 40s, who desperately cling to this teenage idea that ad blocking will save the internet.
That's why we need to spread the word and get more people using adblockers. It's not even a hard sell - the difference is so striking, once it has been seen, it sells itself, even for the most casual users.
"Why would you feel guilty for not visiting a site you’re not paying for and where you’re blocking ads?"
This isn't as simple as it sounds; in fact it's rather complicated (far too involved to cover in depth here).
In short, ethics are involved (and believe it or not I actually possess some)!
In the heyday of newsprint, people actually bought newspapers at a cheap, affordable price, and the bulk of their production was paid for by advertisements. We readers mostly paid for what we read, newspapers were profitable, and much journalism was of fair to good quality. Back then, I had no qualms about forking out a few cents for a copy of the NYT.
Come the internet the paradigm changed and we all know what happened next. In fact, I feel sorry about the demise of newsprint because what's replaced it is of significantly lesser value.
In principle I've no objection to paying for news but I will not do so for junk and ads that I cannot avoid (with magazines and newspapers ads are far less intrusive).
So what's the solution? It's difficult but I reckon there are a few worth considering. For example, I mentioned some while ago on HN that making micro payments to websites ought to be MUCH easier than it is now (this would apply to all websites and would also be a huge boon for open source developers).
What I had in mind was an anonymous "credit" card system with no strings attached. Go to your local supermarket, kiosk or whatever and purchase a scratch card with a unique number, say to the value of $50, for cash, and use that card to make very small payments to websites. Just enter the card's number and the transaction is done (only enter one's details if purchasing something that has to be delivered).
That way both the card and user remain anonymous if the user wishes, also one's privacy is preserved, etc. It could be implemented by blockchain or such.
The technical issues are simple, but the problems are obvious, and they're all political. Governments would go berserk and cry money laundering, tax evasion, criminal activity, etc., and middlemen such as Mastercard and Visa would scream to high heaven that their monopolies were being undercut.
In short, my proposal essentially parallels what now exists with cash: I go to a supermarket and pay cash for groceries, and the store doesn't need to know who I am. It ought to be no big deal, but it isn't.
It seems to me a very simple micropayments system without name, rank and serial number attached would solve many of the internet's payment problems.
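Mechanically it really is simple, which is the point. A toy sketch of the server-side ledger such a scheme implies (all names invented; it deliberately ignores the hard parts, namely fraud, double-spend at scale, and the political objections):

```javascript
// Toy ledger for anonymous prepaid cards: a card is just an opaque
// code mapped to a balance in cents. No identity attached anywhere.
class CardLedger {
  constructor() { this.balances = new Map(); }
  issue(code, amountCents) { this.balances.set(code, amountCents); }
  pay(code, amountCents) {
    const bal = this.balances.get(code);
    if (bal === undefined || bal < amountCents) return false;
    this.balances.set(code, bal - amountCents);
    return true; // the site credits the payment, never learns who paid
  }
  balance(code) { return this.balances.get(code) ?? 0; }
}
```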
Sure, there'll always be hardline scavengers and scrapers, but many people would be only too happy to pay a little for a service they wanted, especially when they knew the money was going into producing better products.
For example, I'd dearly love to be able to, say, purchase a copy of LibreOffice for $10-$20 and know there was enough money in the organisation to develop the product to be fully on par with MSO.
Trouble is when buying stuff on the internet there's a minimum barrier to overcome and it's too high for most people when it comes to making micro payments (especially when the numbers could run into the hundreds per week).
I cannot understand why those who'd benefit from such a scheme haven't at least attempted to push the matter.
Something about these JS-heavy sites I haven't seen discussed: They don't archive well.
Websites that load a big JS bundle, then use that to fetch the actual page content don't get archived properly by The Wayback Machine. That might not be a problem for corporate content, but lots of interesting content has already been lost to time because of this.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites—those behind paywalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
Seems like a gross overestimation of how much facility people have with computers but they don't want random article readers anyway; they want subscribers who use the app or whatever.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites—those behind paywalls and/or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
No.
"savvy" web surfers are a rounding error in global-audience terms. The vast majority of web users, whether paying subscribers to a site like the NYT or not, have no idea what a megabyte is, nor what JavaScript is, nor why they might want to care about either. The only consideration is whether the site has content they want to consume and whether or not it loads. It's true that a double-digit % are using ad blockers, but they aren't doing it out of deep concerns about JavaScript complexity.
Do what you have to do, but no one at the NYT is losing any sleep over people like us.
"…but no one at the NYT is losing any sleep over people like us."
Likely not, but they are over their lost revenues. The profitability of newspapers and magazines has been slashed to ribbons over the past couple of decades, and internet revenues hardly nudge the graphs.
Internet beneficiaries are all new players, Google et al.
> We'll simply cut the headlines from the offending website and past it into a search engine and find another site with the same or similar info but with easier access.
Where do you trust to read the news? Any newsrooms well staffed enough to verify stories (and not just reprint hearsay) seem to have the same issues.
The AP and Reuters are well-staffed and have functional websites. The sites aren’t great (they’ve been afflicted with bloat and advertising along with most outlets, just at a marginally lower rate), but they are at least usable.
I don't understand all these sites with moving parts everywhere, even with muted sound, as if everything were a collection of GIFs. The NYT followed this path and started to insert muted clips prominently on their front page. Very, very annoying.
Do you think youtube will continue to make it possible to use alternate clients, or eventually go the way of e.g. Netflix with DRM so you're forced to use their client and watch ads?
They are also not averse to using legal means to block them. For example, back when Microsoft shipped Windows Phone, Google refused to make an official YouTube client for it, so Microsoft hacked together its own. Google forced them to remove it from the store: https://www.windowscentral.com/google-microsoft-remove-youtu...
If Google were just starting YouTube today, then DRM would likely be enforced through a dedicated app. The trouble for Google is that millions watch YouTube through web browsers, many of whom aren't even using a Google account, let alone subscribing to a particular YouTube channel. Viewership would drop dramatically.
Only several days ago I watched the presenter of RobWords whinging about wanting more subscribers, saying that many more people just watch his presentations than watch and also subscribe.
The other problem YouTube has is that, unlike Netflix et al with their high-profile commercial content, it hosts millions of small presenters who do not use advertising and/or just want to tell the world at large their particular stories. Enforced DRM would ruin that ecosystem altogether.
Big tech will slowly enforce "secure browsing" and "secure OS" in a way that will make it impossible to browse the web without a signed executable approved by them. DRM is just a temporary stopgap.
What does playing fair mean in this context? It would be one thing if you were a paid subscriber complaining that even paying sucks so you left, but it sounds like you’re not.
It is strange to hear these threats about avoiding websites from people who are not subscribers and also definitely using an ad blocker.
News sites aren’t publishing their content for the warm fuzzy feeling of seeing their visitor count go up. They’re running businesses. If you’re dead set on not paying and not seeing ads, it’s actually better for them that you don’t visit the site at all.
I am a paid subscriber to the NYT and have been reading it on paper / the internet for 30+ years. It is an enshittification winner in terms of tracking and clickbait. It doesn't feel like a serious news outlet anymore; it feels like HuffPost or similar.
I'd like to answer that in detail but it's impractical to do so here as it'd take pages. As a starter though begin with them not violating users' privacy.
Another quick point: my observation is that the worse the ad problem, the lower the quality of the content. Cory Doctorow's "enshittification" encapsulates the problems in a nutshell.
You're right, it means nothing. But it cuts two ways. These sites are sending me bytes and I choose which bytes I visualize (via an ad blocker). Any expectation the website has about how I consume the content has no meaning and it's entirely their problem.
The NYT is comically bad. Most of their (paywalled) articles include the full text in a JSON blob, and that text is typically 2-4% of the HTML. Most of the other 96-98% is ads and tracking. If you allow those to do their thing, you're looking at probably two orders of magnitude more overhead.
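You can check this yourself; a rough sketch of recovering an embedded blob like that (the `window.__preloadedData` marker name and JSON shape below are assumptions for illustration, not the NYT's actual structure):

```javascript
// Recover article text from a JSON blob embedded in page HTML.
function extractEmbeddedJson(html, marker) {
  const start = html.indexOf(marker);
  if (start === -1) return null;
  const braceStart = html.indexOf('{', start);
  if (braceStart === -1) return null;
  // Naive brace matching: fine for a sketch, but it breaks on braces
  // inside JSON strings, so don't use this in production.
  let depth = 0;
  for (let i = braceStart; i < html.length; i++) {
    if (html[i] === '{') depth++;
    else if (html[i] === '}') depth--;
    if (depth === 0) return JSON.parse(html.slice(braceStart, i + 1));
  }
  return null;
}
```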
My family's first broadband internet connection, circa 2005, came with a monthly data quota of 400 MB.
The fundamental problem of journalism is that the economics no longer works out. Historically, the price of a copy of a newspaper barely covered the cost of printing; the rest of the cost was covered by advertising. And there was an awful lot of advertising: everything was advertised in newspapers. Facebook Marketplace and Craigslist were a section of the newspaper, as was whichever website you check for used cars or real estate listings. Journalism had to be subsidised by advertising, because most people aren't actually that interested in the news to pay the full cost of quality reporting; nowadays, the only newspapers that are thriving are those that aggressively target those who have an immediate financial interest in knowing what's going on: the Financial Times, Bloomberg, and so on.
The fact is that for most people, the news was interesting because it was new every day. Now that there is a more compelling flood of entertainment in television and the internet, news reporting is becoming a niche product.
The lengths that news websites are going to to extract data from their readers to sell to data brokers is just a last-ditch attempt to remain profitable.
> The fundamental problem of journalism is that the economics no longer works out.
Yes it does, from nytimes actual earning release for Q 2025:
1. The Company added approximately 450,000 net digital-only subscribers compared with the end of the third quarter of 2025, bringing the total number of subscribers to 12.78 million.
2. Total digital-only average revenue per user (“ARPU”) increased 0.7 percent year-over-year to $9.72
2025 subscription revenue was 1.950 billion dollars. Advertising was 565 million that includes 155 million dollars worth of print advertising.
Sure, operating profit is only 550 million, very close to the advertising revenue, but the bulk of their income is subscriptions; they could make it work if they had to. My suspicion is that if they dropped all the Google ads they could have better subscription retention and conversion rates as well.
I remember getting punished by my parents for downloading a 120MB World of Tanks update over metered home internet. Our monthly quota was 250MB. It was not that long ago: 2010.
In the late '90s, a friend recommended I download some freeware from a website. It was 1.2MB. I told him "are you crazy? 1.2MB? It's gonna take a whole week!"
A lot of free government phone plans supplied to homeless, parolees etc in the USA only come with 3GB of transfer credit, which is usually burned up in about 3 days, leaving them without any Internet access. (or sometimes it'll drop to a throttled connection that is so slow that it can never even load Google Maps)
I just loaded the nytimes.com page as an experiment. The volume of tracking pixels and other ad nonsense is truly horrifying.
But at least in terms of the headline metric of bandwidth, it's somewhat less horrifying. With my ad blocker off, Firefox showed 44.47MB transferred. Of that, 36.30MB was MP4 video. These videos were journalistic in nature (they were not ads).
So, yes in general, this is like the Hindenburg of web pages. But I still think it's worth noting that 80% of that headline bandwidth is videos, which is just part of the site's content. One could argue that it is too video heavy, but that's an editorial issue, not an engineering issue.
Why are we supposed to think it's normal to see videos on every page? Even where it's directly relevant to the current page, what's the justification for thrusting those 36.30MB on the user before they explicitly click play?
It’s a news site with a lot of auto-playing video. If you like that kind of content, great. If not, there are lots of other websites with different mixes of content. I subscribe to The Economist, which has few videos, and they never auto-play.
But that’s a question of taste. 5MB of JavaScript and hundreds of tracking assets is not.
I also use and like the comparison in units of Windows 95 installs (~40MB), which is also rather ironic in that Win95 was widely considered bloated when it was released.
While this article focuses on ads, it's worth noting that sites have had ads for a long time, but it's their obnoxiousness and resource usage that's increased wildly over time. I wouldn't mind small sponsored links and (non-animated!) banners, but the moment I enable JS to read an article and it results in a flurry of shit flying all over the page and trying to get my attention, I leave promptly.
I would love for someone more knowledgeable in this space than I to chime in on the economics of this industry
Are the few cents you get from antagonizing users really worth it?
I suspect the answer is simple and that most users don’t give a shit
I think it has to do a lot with when you came of age - I’m in my late 30s, I got my first tech job at 14 as a sys admin for a large school district, and every single developer, admin, etc that I knew was already going on about the free internet. As a result, I’ve never had a tolerance for anything but the most reasonable advertisements
I think that ideology is necessary to care enough and be motivated enough to really get rid of ads, how fucking awful the websites are alone should be enough but for most people it isn’t
Not only are loading times and total network usage ridiculous, sites will continue to violate your privacy via trackers and waste your CPU even when background idling. I've written about these issues a few times in the last few years, so just sharing for those interested:
Allowing scripting on websites (in the mid-90s) was a completely wrong decision. And an outrage. Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust. That’s completely unacceptable; it’s fundamentally flawed.
Of course, you disable scripts on websites. But there are sites that are so broken that they no longer work properly, since the developers are apparently so confused that they assume people only view their pages with JavaScript enabled.
It would have been so much better if we had simply decided back in the ’90s that executable programs and HTML don’t belong together. The world would be so much better today.
> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust
Would've been cool if we could know if site X served the same JS as before. Like a system (maybe even decentralized) where people could upload hashes of the JS files for a site. Someone could even review them and post their opinions. But mainly you'll know you're getting the same JS as before - that the site hasn't been hacked or that you're not being targeted personally. If a file needs to update, the site could say in the changelog something like "updated the JS file used for collapsing comments to fix a bug". This could be pushed by the users to the system.
Especially important for banking sites and webmail.
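Part of this exists today as Subresource Integrity (SRI): the page pins a hash of each script, and the browser refuses to execute a file that doesn't match. A minimal sketch of computing an SRI-style digest in Python; the script content and filename are stand-ins:

```python
import base64
import hashlib

def sri_digest(content: bytes, algo: str = "sha384") -> str:
    """Compute a Subresource Integrity digest ("sha384-<base64>") for a script."""
    digest = hashlib.new(algo, content).digest()
    return f"{algo}-{base64.b64encode(digest).decode('ascii')}"

script = b"console.log('collapse comments');"  # stand-in for a served JS file
print(sri_digest(script))
# The page pins it like so, and the browser refuses a non-matching file:
#   <script src="comments.js" integrity="sha384-..." crossorigin="anonymous"></script>
```

SRI only verifies "same file the author pinned", not a community-reviewed changelog, so the decentralized review layer described above would still need to be built on top of it.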
Stepping back, it's pretty ridiculous that I need to download executable code, often bloated, solely to view read-only content. Just render the thing on the backend and send it to the client.
> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust.
JavaScript and WebAssembly programs are always executed in a sandboxed VM, without read access to the host OS files (unless, of course, you grant it).
Enabling scripting was a necessary step for interactive websites. Without it, a full page load would be required every time you upvote a Hacker News comment. In my opinion, the real problem is that browsers allow too many connections to third-party domains, which are mostly ads and trackers. Those should require user-approved permissions instead of being the default.
Why can't MY browser send some random JS to THEIR website? If it's safe for me to run some stranger's code, should it be safe for strangers to run my code?
Disable not just JavaScript, but also CSS. I'm not kidding. Many websites actually contain all the content in HTML but use CSS to hide and then use JavaScript to show it again.
There is obviously huge demand for scripting on websites. There is no one authority on what gets allowed on the web, if the existing orgs didn't implement it, someone else would have and users would have moved over when they saw they could access new more capable, interactive pages.
The 49MB webpage just shows what our priorities are. It shows the target audience has fast internet that can load this without issues. On my average home connection in Australia, I can download a 49MB page in 0.3 seconds. We spend time optimising for what matters to the end user.
This is just the tip of the iceberg. Don't get me started on airline websites (looking at you, Air Canada), where the product owner, designers, and developers can't get a simple workflow straight without loading megabytes of useless JavaScript and interrupting the user journey multiple times.
Give me back the command line terminal like Amadeus, that would be perfect.
How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
You can't beat China Southern. They have the most dog shit website I've ever seen. The flight was fine, but I gave up on online check-in after 3 attempts. Never mind the bloat:
- required text fields with wrong or missing labels. One field was labeled "ticket no.". It kept getting rejected. I randomly tried passport number instead. It worked.
- sometimes fields only have a placeholder that you can't fully read because the field isn't wide enough ("Please enter the correct...") and the placeholder disappears once you start typing.
- date picker is randomly in Chinese
- makes you go through multi step seat selection process only to tell you at the end that seat selection is not possible anymore.
- signed up with email; logged out and went back to the SAME login page; now sign up via phone number is required!?
> How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
Loudly oppose the trendchasing devs who have been brainwashed into the "newer is better" mindset by Big Tech. I'm sure the shareholders would want to reduce the amount they spend on server/bandwidth costs and doing "development and maintenance" too.
Simple HTML forms can already make for a very usable and cheap site, yet a whole generation of developers have been fed propaganda about how they need to use JS for everything.
Modern web dev is ridiculous. Most websites are an ad-ridden tracking hellscape. Seeing sites like HN, where individual lines of JS are taken seriously, is a godsend. Make the web less bloated.
What really blows my mind is how unusable it is. How often do you visit a web site and there’s so much crap overlaid on the page that you can barely see the actual content? Surely that can’t be good for their ability to make money, yet they persist.
I started writing in a Dioxus (Rust framework) style: max 1KB of JS code, with diffs sent via WebSocket from the Rust server. More importantly, all the code now lives on the server, and thanks to WebSockets and Rust it executes at almost the same speed as it would on the client. Back to normal page sizes. And, of course, virtual scrolling everywhere.
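To illustrate the bandwidth side of the diff approach with something framework-free: Dioxus diffs a virtual DOM rather than text, so this is only an analogy, but even a plain text diff shows why pushing deltas over a socket beats re-sending the page:

```python
import difflib

def page_diff(old_html: str, new_html: str) -> str:
    """Unified diff the server could push instead of re-sending the full page."""
    return "\n".join(difflib.unified_diff(
        old_html.splitlines(), new_html.splitlines(), lineterm=""))

# A 1000-item list with a single changed entry, as a stand-in for a page update.
old = "<ul>\n" + "\n".join(f"<li>item {i}</li>" for i in range(1000)) + "\n</ul>"
new = old.replace("item 500<", "item 500 (updated)<")
patch = page_diff(old, new)
print(len(new), "bytes full page vs", len(patch), "bytes diff")
```

The patch is a couple of hundred bytes against roughly 18KB of full page, and the gap only widens as pages grow.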
This is why people continue to lament Google Reader (and RSS in general): it was a way to read content on your own terms, without getting hijacked by ads.
What on earth do you have to rely on alphabet, an ad company, to read rss for? there are many other options, that are not made by an ad company.
Google Reader was never the answer. It's such a shame that people even here don't realize that relying on Google for that had interests at odds - and you weren't part of the equation at all.
Well, except for your data. You didn't give them enough data. So they shut up shop. Gmail though, amirite? :D
Yeah I wonder why gmail was not one of the shut down products /s
I remember in 2008, when Wizards of the Coast re-launched the official Dungeons & Dragons website to coincide with the announcement of the fourth edition rules. The site was something in the region of 4 MB, plus a 20 MB embedded video file. A huge number of people were refreshing the site to see what the announcement was, and it was completely slammed. Nobody could watch the trailer until they uploaded it to YouTube later.
4 MB was an absurd size for a website in 2008. It's still an absurd size for a website.
> users are greeted by what I call Z-Index Warfare
Nice term!
> Or better yet, inject the newsletter signup as a styled, non-intrusive div between paragraphs 4 and 5. If the user has scrolled that far, they are engaged.
They're engaged with the content! There is no way to make some irrelevant signup "non-intrusive". It's similar to links to unrelated articles - do you want users to actually read the article or jump around reading headlines?
Z-index warfare is so stupid (the practice, not the name). It's not only with media publications, but ecommerce sites as well.
There have been countless businesses that have lost out on my money because I clicked on their ad (good job!), started reading the product info on their website as it loaded (that's basically a sale), and then the page finished loading with a barrage of popups for cookie consent, newsletter signup for a discount code, special offers, special sale, spin the wheel for a prize, etc. That's when I close the tab and forget about it.
This rubbish also exists disproportionately for recipe pages/cooking websites as well.
You have 20 ads scattered around, an autoplaying video of some random recipe/ad, 2-3 popups to subscribe, buy some affiliated product and then the author's life story and then a story ABOUT the recipe before I am able to see the detailed recipe in the proper format.
It's second nature for me to open all these websites in reader mode at this point.
You want to know why so many people either jump straight to comments or use alternate sources (archive, llms)? Because if you load the actual site, it freaking blows to use the damn thing.
So much hostile user design.
Edit: NPR gets a little shout out for being able to close their annoying pop-ups by clicking anywhere that's not the notification. So it's still crappy that it hijacks the screen, but not awful I guess?
It's really hard to consider any kind of web dev as "engineering." Outcomes like this show that they don't have any particular care for constraints. It's throw-spaghetti-at-the-wall YOLO programming.
There are plenty of web devs who care about performance and engineering quality. But caring about such things when you work on something like a news site is impossible: these sites make their money through user tracking, and it's literally your job to stuff in as many 3rd-party trackers as management tells you to. Any dev who says no on the basis that it'll slow the site down will get fired as quickly as a chef who gets a shift job at McDonald's and tries to argue for better cuisine.
It's almost criminal that the article does not mention network-wide DNS blocklists as an obvious solution to this problem. I stop nearly 100% of ads in their tracks using the Hagezi ultimate list, and run uBlock on desktop for cosmetic filtering and YouTube.
I should really run some tests to figure out how much lighter the load on my link is thanks to the filter.
I also manually added some additional domains (mostly fonts by Google and Adobe) to further reduce load and improve privacy.
It does indeed work pretty well today, but they have already developed ways to circumvent it. For example, serving ads from the same domain as the main page.
It would be less hypocritical if that critique of the situation wasn't posted on a website that itself loads unnecessary 3rd party resources (e.g. cloudflare insights).
Luckily I use a proper content blocker (uBlock Origin in hard mode).
One of the things I don't get is the economics of these trackers.
Someone is serving this amount of data to every visitor. Even if you want to track the user as much as possible, wouldn't it make sense to figure out how to do that with the least amount of data transfer possible as that would dramatically reduce your operating cost?
Perhaps size optimization is the next frontier for these trackers.
Traffic for the static payload is super cheap. And the insane number of requests is handled easily by modern event-based architectures. The operating costs are most likely only a tiny fraction of the overall economics of the tracker's business model. The generated tracking data is certainly worth an order of magnitude more than it costs to generate.
The newest thing is “please wait a minute while Cloudflare decides you’re not a bot.” So you sit through that, then deal with the GDPR banner, then you get to watch an ad.
I was thinking about creating charts of shame for this across some sites. Is there some browser extension that categorizes the data sources and requests like in a pie chart or table? Tracking, ad media, first party site content...? Would be nice with a piled bar chart with piles colorized by data category.
Maybe you'd need one chart for request counts (to make tracking stand out more) and another for amount of transferred data.
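I don't know of an extension that draws exactly these charts, but the raw data is easy to get: save the page as a HAR file from the Network tab and aggregate it yourself. A minimal sketch; the domain-to-category map is a hand-rolled stub that a real tool would replace with a maintained blocklist:

```python
import json
from collections import Counter
from urllib.parse import urlparse

# Illustrative stub; a real tool would use a maintained blocklist such as EasyList.
CATEGORIES = {
    "doubleclick.net": "tracking",
    "google-analytics.com": "tracking",
    "nytimes.com": "first-party",
}

def categorize(har: dict) -> tuple[Counter, Counter]:
    """Return (request counts, transferred bytes) per category from a HAR dump."""
    counts, transferred = Counter(), Counter()
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname or ""
        cat = next((c for dom, c in CATEGORIES.items() if host.endswith(dom)), "other")
        counts[cat] += 1
        # Chrome writes _transferSize; it can be -1 for cached/failed responses.
        transferred[cat] += max(entry["response"].get("_transferSize", 0), 0)
    return counts, transferred

# Typical use: counts, sizes = categorize(json.load(open("page.har")))
```

The two counters map directly onto the two charts suggested above: one for request counts (where tracking dominates) and one for transferred bytes (where media dominates).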
I started on this project when I was at The New Yorker. I had just managed to convince people to give us space to do web performance optimization - and then we had to drop it quickly to work on AMP. Very frustrating.
This site was created to give developers and pms some ammunition to work on improving load speed
I think it's a GOOD thing, actually. Because all these publications are dying anyway. And even if you filter out all the ad and surveillance trash, you are left with trash propaganda and brainrot content. Why even make the effort of extracting the actual text from some "journalist" at these propaganda outlets? It's not even worth it.
If people tune out only because how horrible the sites are, good.
I think that unless one is well-connected or is willing to pay significant money, time, and/or effort to obtain high-quality information, one will still generally get more accurate information about the world by reading between the lines of the propaganda than one would by not reading the propaganda at all.
rule #1 is to always give your js devs only core 2 quad cpus + 16GB of RAM
they won't be able to complain about low memory but their experience will be terrible every time they try to shove something horrible into the codebase
Even enterprise COTS products can have some of these issues. We have an on-premise Atlassian suite, and Jira pages sometimes have upwards of 30MB total payloads for loading a simple user story page — and keep in mind there is no ad-tech or other nonsense going on here, it’s just pure page content.
> I don't know where this fascination with getting everyone to download your app comes from.
So they can do exactly what they are doing on the web, and maybe even more, but with native code so it feels much faster.
I've gotten to the point of wondering why all the tracking companies and ad networks can't just share and use the same library.
But on web page bloat: let's not forget apps are insanely large as well. 300-700MB for a banking, travel, or shopping app. Even if you cut 100MB of L10n, they're still huge - again because of tracking and other things.
I was really surprised when I went to book a flight on Frontier (don't judge me!) and a request from analytics.tiktok.com loaded. I have a lot of discomfort about that. Bloat and surveillance go hand in hand.
Oh yeah, that old topic. We’ve already discussed this back when text-heavy websites started reaching megabyte sizes. So I’m going to go look for the posts in this thread that try to explain and defend that. I’m especially looking forward to the discussions about whether ad blocking is theft or morally reprehensible. If those are still around.
This site more or less practices what it preaches. `newsbanner.webp` is 87.1KB (downloaded and saved; the Network tab in Firefox may report several times that figure, and I don't know why); the total image size is less than a meg, and then there's just 65.6KB of HTML and 15.5KB of CSS.
And it works without JavaScript... but there does appear to be some tracking stuff. A deferred call out to Cloudflare, a hit counter I think? and some inline stuff at the bottom that defers some local CDN thing the old-fashioned way. Noscript catches all of this and I didn't feel like allowing it in order to weigh it.
I think there'll continue to be growth in page sizes, but then maybe we'll consider efficiency, or the NYTimes shuts down and the 20MB page will be the liquidators selling the domain. Maybe we don't even use domains by then as everything is on an app.
Only major media can get away with this kind of bloat. For the normal website, Google would never include you in the SERPs even if your page is a fraction of that size.
The same phenomenon worsened during the DotCom Meltdown and the Great Financial Crisis. This accelerated desperation is a sign of the times; paying subscribers are likely cancelling due to current economic conditions.
Bit unfair, turned off my adblocker and ran NY Times website with cache disabled via Dev Tools, came to 3MB. Still pretty damn high but not 49MB. (Will say I'm in the UK so might be different across the pond).
"The Sticky Video Player
Publishers love embedding auto-playing videos these days, which isn't really popular. You'll find multiple forum, Reddit, HN, or Twitter threads about it.
To make it somehow worse...when you scroll down, you think it would leave you as it leaves the viewport. No. It detaches, shrinks and pins itself to the bottom right of your screen and continues playing. It keeps the distraction going and as if teasing you, features a microscopic 'X' button with a tiny hit area (violating Fitts's Law)."
Is there no way to stop this? The "do not autoplay videos" option often doesn't work.
> The user must perform visual triage, identify the close icons (which are deliberately given low contrast) and execute side quests just to access the 5KB of text they came for.
The thing is, though... they _don't_ have to. It's been my standard practice for years to just tap ctrl-w the moment any web page pops up a modal box. Some leeway is given to cookie dialogs _if_ they have a disagree/disable button _prominently_ visible, otherwise they're ctrl-w'd too.
I recently had to switch to a Japanese LINE account to gain access to certain features. I had no idea how good I had it on my American LINE account. The Japanese account is covered with ads EVERYWHERE on the home screen and even in the chat area. I have no idea how this app is still popular in Japan. I would pay to remove the ads if I could.
They also have their own tiktok and AI slop that I never knew about.
Another reason for this is that non-tech people can often inject third-party scripts via the CMS, GTM, and so on. I remember once we had a large drop in indexed pages on Google and it turned out that a script had moved our entire site into an iframe. The marketing people who injected it were like "it's just a script".
I worked at big newspapers as a software engineer. Please do not blame the engineers for this mess. As the article says news is in a predicament because of the ads business model. Subscriptions alone usually cannot cover all costs and ads will invariably make their way in.
For every 1 engineer it seems like there are 5 PMs who need to improve KPIs somehow and thus decide auto playing video will improve metrics. It does. It also makes people hate using your website.
I would constantly try to push back against the bullshit they'd put on the page but no one really cares what a random engineer thinks.
I don't think there's any real way to solve this unless we either get less intrusive ad tech or news gets a better business model. Many sites don't even try with new business models, like local classifieds or local job boards. And good luck getting PMs to listen to an engineer talking about these things.
The sad thing is, this is already a paywalled site.
I’m afraid someone who wants to support professional journalism and agrees to pay ~$300/yr for an NYT subscription still gets most (all?) of this nonsense?
It's certainly one of the reasons why I ended my NY Times subscription in 2024, and split that money between recurring donations to public media, Archive.org, and the EFF.
It's my device. I decide what I download, execute and display on my device. A website is free to offer me to download an ad and I am free to decline that offer. Demanding me to download anything on my device or even worse execute someone else's programs [JS] and claiming that I have a moral obligation to do so is deeply creepy.
Maybe I'm just getting old, but I've gotten tired of these "Journalists shouldn't try to make their living by finding profitable ads, they should just put in ads that look pretty but pay almost nothing and supplement their income by working at McDonalds" takes.
I'm pretty sure people would read more and click on more ads if they didn't have to endure waiting for 49 MB of crap and then navigating a pop-up obstacle course for each article.
If the ad-tech sausage factory needs 49MB of JS for a clickbait article, that is not "earning" a living. They are just externalizing costs to users and ISPs. You can defend the hustle, but the scale of waste here is cartoonish.
If you need a CDN and half a browser runtime just to show 800 words about celebrity nonsense, the business model is broken. Everyone else is footing the bandwidth bill for nonsense they never asked to receive.
This is what killed my willingness to subscribe to most outlets. If I'm paying, I expect the page to load in under a second with zero tracking. Instead you get the same bloated experience minus a banner ad or two.
This argument would be valid if journalism were actually journalism, instead of ripping off trending stories from HN and Reddit, rehashing them with sloppy AI, calling it a day, and burying 4 lines of text inside 400 ads.
I don't like the state of journalism either but you realize this is a vicious cycle, no? People not paying for news (by buying newspaper, or more importantly paying for classified ads) leading to low quality online reporting leading to people not wanting to pay for online news.
Our developers once managed to rack up around 750MB of transfer per open website.
They had put in a ticket with ops saying the server was slow and asking if we could look at it. So we looked. Every single video on a page with a long video list pre-loaded a part of itself. The only reason the site didn't run like shit for them is because the office had direct fiber to our datacenter a few blocks away.
We really shouldn't allow web developers more than 128kbit of connection speed, anything more and they just make nonsense out of it.
PSA for those who aren’t aware: Chromium/Firefox-based browsers have a Network tab in the developer tools where you can dial down your bandwidth to simulate a slower 3G or 4G connection.
Combined with CPU throttling, it's a decent sanity check to see how well your site will perform on more modest setups.
I once spent around an hour optimizing a feature because it felt slow - turns out that the slower simulated connection had just stayed enabled after a restart (can’t remember if it was just the browser or the OS, but I previously needed it and then later just forgot to turn it off). Good times, useful feature though!
I still test mine on GPRS, because my website should work fine in the Berlin U-Bahn. I also spent a lot of time working from hotels and buses with bad internet, so I care about that stuff.
Developers really ought to test such things better.
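For a sense of scale, here's the floor on transfer time for the article's 49MB page at a few link speeds. This is an optimistic lower bound that ignores latency, TCP slow-start, and protocol overhead:

```python
def seconds_to_transfer(size_bytes: int, link_kbit_s: float) -> float:
    """Lower bound: payload size over raw link rate, no latency or overhead."""
    return size_bytes * 8 / (link_kbit_s * 1000)

PAGE = 49 * 1024 * 1024  # the 49MB page from the article
for name, kbit in [("GPRS (~48 kbit/s)", 48),
                   ("3G (~1 Mbit/s)", 1_000),
                   ("4G (~10 Mbit/s)", 10_000)]:
    print(f"{name}: {seconds_to_transfer(PAGE, kbit):,.0f}s")
```

On GPRS that works out to roughly 2.4 hours for a single page load; even on a decent 4G link it's the better part of a minute.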
It doesn't throttle Websockets, so be careful with that
For macOS users you can download the Network Link Conditioner preference pane (it still works in the System Settings app) to do this system wide. I think it's in the "Additional Tools for Xcode" download.
This made me chuckle.
I had a fairly large supplier that was so proud that they implemented a functionality that deliberately (in their JS) slows down reactions from http responses. So that they can showcase all the UI touches like progress bars and spinning circles. It was an option in system settings you could turn on globally.
My mind was blown, are they not aware of F12 in any major browser? They were not, it seems. After I quietly asked about that, they removed the whole thing equally quietly and never spoke of it again. It's still in release notes, though.
This was about 2 years ago, and browsers had already been able to do that for 10-14 years (depending on how you count).
For Firefox users, here's where it's hidden (and it really is hidden): Hamburger menu -> More tools -> Web developer tools, then keep clicking on the ">>" until the Network tab appears, then scroll over on about the third menu bar down until you see "No throttling", that's a combobox that lets you set the speed you want.
Alternatively, run uBlock Origin and NoScript and you probably won't need it.
Peanuts! My wife’s workplace has an internal photo gallery page. If your device can cope with it and you wait long enough, it’ll load about 14GB of images (so far). In practice, it will crawl along badly and eventually just crash your browser (or more), especially if you’re on a phone.
The single-line change of adding loading=lazy to the <img> elements wouldn’t fix everything, but it would make the page at least basically usable.
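For reference, the one-attribute change being described; `loading="lazy"` is the standard attribute, and the filenames and dimensions here are placeholders:

```html
<!-- Before: every image is fetched eagerly at page load -->
<img src="photo-0001.jpg" alt="Gallery photo">

<!-- After: fetched only once it approaches the viewport -->
<img src="photo-0001.jpg" alt="Gallery photo" loading="lazy" width="800" height="600">
```

Explicit width/height also let the browser reserve layout space up front, so the page doesn't jump around as images arrive.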
Amazing. Well, any employee that wants more ram could use that internal site as an excuse.
"Why do you want 64 GB RAM in your laptop?"
"I need that to load the gallery"
Haha, excellent. Presumably all the images are full res and haven't been scaled down for the web at all?
> We really shouldn't allow web developers more than 128kbit
Marketing dept. too. They're the primary culprits in all the tracking scripts.
Reserve a huge share of the blame for the “UX dEsIgNeRs”. Let's reimplement every single standard widget in a way that has 50% odds of being accessible, has bugs, doesn't work correctly with autofill most of the time, and adds 600kB of code per widget. Our precious branding requires it.
Often we're told to add Google XSS-as-a-serv.. I mean Tag Manager, and then the non-tech people in Marketing go ham without a care in the world beyond their metrics. Can't blame them, it's what they're measured on.
Marketing and managers should be restricted as well, because managers set the priorities.
You can still make a site unusable without having it load lots of data. Go to https://bunnings.com.au on a phone and try looking up an item. It's actually faster to walk around the store and find an employee and get them to look it up on an in-store terminal than it is to use their web site to find something. A quick visit to profiles.firefox.com indicates it's probably more memory than CPU, half a gigabyte of memory consumed if I'm interpreting the graphical bling correctly.
You don't even need video for this: I once worked for a company that put up a carousel with everything in the product line, and every element pointed straight at the high-resolution photography assets - the ones that might be useful for full-page print media ads. 6000x4000 PNGs. It worked fine in the office, they said. Add another nice background that size, a few more to have on the sides as you scroll down...
I was asked to look at the site when it was already live, and some VP of the parent company decided to visit the site from their phone at home.
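The file size is only half the problem: the browser has to decode every PNG into a raw bitmap before it can draw it. Quick arithmetic for assets of that size (assuming RGBA, and ignoring GPU copies and mipmaps):

```python
def decoded_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Memory for one decoded RGBA bitmap, in MB (ignoring mipmaps and GPU copies)."""
    return width * height * bytes_per_pixel / 2**20

# One 6000x4000 print-resolution image, decoded:
print(f"{decoded_mb(6000, 4000):.1f} MB per image")  # ~91.6 MB
```

A carousel holding ten of those is close to a gigabyte of bitmaps regardless of how well the PNGs compressed on the wire, which is roughly what that VP's phone had to survive.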
Many web application frameworks already have extensive built-in optimization features, but examples like the one you shared show that there are fundamentals many people contributing to the modern web simply don't grasp, and that the frameworks won't catch for you in many cases. It speaks to an over-reliance on the tools and a critical lack of understanding of the technologies they coexist with.
There's essentially zero chance the developers get to make choices about the ads and ad tracking.
I wouldn't even guarantee it's developers adding it. I'm sure they have some sort of content management system for doing article and ad layout.
Same for fancy computers. Dev on a fast one if you like, but test things out on a Chromebook.
“Craptop duty”[1]. (Third time in three years I’m posting an essentially identical comment, hah.)
[1] https://css-tricks.com/test-your-product-on-a-crappy-laptop/
Music producers often have some shitty speakers known as grot boxes that they use to make sure their mix will sound as good as it can on consumer audio, not just on their extremely expensive studio monitors. Chromebooks are perfectly analogous. As a side note, today I learned that Grotbox is now an actual brand: https://grotbox.com
Based on the damage rate for company laptop screens, one can usually be sure anything high-end will be out of your own pocket. =3
Should also give designers periodically small displays with low maximum contrast, and have them actually try to achieve everyday tasks with the UX they have designed.
Yes, and a machine that is at least two generations behind the latest. That will cut down on bloat significantly.
If you want to see context aware pre-fetching done right go to mcmaster.com ...
There are good reasons to have a small cheap development staging server, as the rate-limited connection implicitly trains people what not to include. =3
And this! https://www.mcmaster.com/help/api/ Linked from the footer of every page!
I'm so happy to have seen their web site that I want to do business with them, even though I have no business to be done.
Well, as long as the website has already fully loaded and is responsive, and the videos show a thumbnail/placeholder, you're not blocked by it. Preloading, even very aggressive preloading, is a thing nowadays. It's hostile to the user (because it eats network traffic they pay for), but project managers will often override that to maximize gains from ad revenue.
This is a general problem with lots of development: network, memory, GPU speed. The designer/engineer is on a modern Mac with 16-64 gigs of RAM and fast internet. They never check how their code/design works on some low-end Intel UHD 630 or whatever. Lots of developers make 8-13 layer blob backgrounds that run at 60 or 120fps on their modern Mac but at 5-10fps on the average person's PC because of 15x overdraw.
I used the text web (https://text.npr.org and the like) through Lynx. Also Usenet, Gopher, Gemini, some 16kbps Opus streams - everything under 2.7kbps when my phone data plan was throttled and I was using it in tethering mode. Tons of sites did work, but gopher://magical.fish ran really fast.
Bitlbee saved (and still saves) my ass with tons of the protocols available via IRC using nearly nil data to connect. Also you can connect with any IRC client since early 90's.
Not just web developers. Electron lovers should be throttled too: give them machines with 2GB of RAM and some older Celeron/Core Duo with a GL 2.1-compatible video card. If the desktop 'app' is smooth on that machine, your project is ready.
I'm pretty damn sure those videos were put on the page because someone in marketing wanted them. Then, I'm pretty sure, QA complained the videos loaded too slowly, so the preloading was added. Then the upper management responsible for the mess shrugged their shoulders and let it ship.
You're not insightful for noticing a website is dog slow or that there is a ton of data being served (almost none of which is actually the code). Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.
From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.
> Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.
If a bridge engineer is asked to build a bridge that would collapse under its own weight, they will refuse. Why should it be different for software engineers?
7 replies →
And the devs are responsible for finding a good technical solution under these constraints. If they can't, they're responsible for communicating their constraints to the rest of the team so a better tradeoff can be found.
this isn't purely laundering blame. what's frustrating for the infrastructure/operations side is that dev teams routinely kick the can down to them instead of documenting the performance/reliability weak points. in this case, when someone complains about the performance of the site, both dev and qa should have documented artifacts that explain this potential problem. as an infrastructure and reliability person, i am happy to support this effort with my own analysis. i am less inclined to support the dev team that just says, "hey, i delivered what they asked for, it's up to you to make it functional."
> From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.
this belittles the intelligence of the dev team. they should know better. it's like saying "i really thought i could pour vodka in the fuel tank of this porsche and everything would function correctly. must be porsche's fault."
2 replies →
"Developers" here clearly refers to the entire organization responsible. The internal politics of the foo.com providers are not relevant to Foo users.
1 reply →
Fuck that. I just left a job where the IT dept just said "yes and" to the executives for 30 years. It was the most fucked environment I've ever seen, and that's saying a lot coming from the MSP space. Professionals get hired to do these things so they can say "No, that's a terrible idea" when people with no knowledge of the domain make requests. Your attitude is super toxic.
3 replies →
Sounds just like a "helpless" dev that shifts blame to anyone but themselves.
9 replies →
The devs are the subject matter experts. Does marketing understand the consequences of preloading all those videos? Does upper management? Unlikely. It’s the experts’ job to educate them. That’s part of the job as much as writing code is.
In general, how people communicate internally and with the public is important.
https://en.wikipedia.org/wiki/Conway's_law
Have a wonderful day =3
From the perspective of the devs, they have a responsibility for saying when something literally won't fly anywhere, ever. Saying the business is responsible for every bad decision is a complete abrogation of your responsibilities.
7 replies →
> 422 network requests and 49 megabytes of data
Just FYI how this generally works: it's not developers who add it, but non-technical people.
Developers only add a single `<script>` in the page, which loads Google Tag Manager, or similar monstrosity, at the request of someone high up in the company. Initially it loads ~nothing, so it's fine.
Over time, non-technical people slap as many advertising "partner" scripts they can in the config of GTM, straight to prod without telling developers, and without thinking twice about impact on loading times etc. All they track is $ earned on ads.
(It's sneaky because those scripts load async in the background, so it doesn't immediately feel like the website gets slower / more bloated. And of course, on a high-end laptop the website feels "fine" compared to a cheap Android. Also, there's nothing developers can do about those requests; they're fully under the control of all those 3rd parties.)
Fun fact: "performance" in the parlance of adtech people means "ad campaign performance", not "website loading speed". ("What do you mean, performance decreased when we added more tracking?")
I tried to fight against the introduction of GTM in a project I worked on; we spent a lot of effort on coding, reviewing, testing, optimizing and minimizing client-side code before our end-users would see it, and the analytics people want a shortcut to inject any JS anywhere?
I didn't win that one, but I did make sure that it would only load after the user agreed to tracking cookies and the like.
Yeah, it’s really hard to compete with a solution that takes engineers out of the loop. The biggest reason large orgs go so crazy with GTM is that it’s a shadow deployment pipeline that doesn’t require waiting for engineers to work a request, or QA, or a standard release process.
And sure, better prioritization and cooperation with eng can make the “real” release processes work better for non-eng stakeholders, but “better” is never going to reach the level of “full autonomy to paste code to deploy via tag manager”.
This is the same reason why many big apps have a ton of WordPress-managed pages throughout the product (not just marketing pages); often, that's because the ownership and release process for the WP components is "edit a web UI" rather than "use git and run tests and have a test plan and schedule a PR into a release".
Similar story here. I had to remind them multiple times that the website was not conforming with the law, explain multiple times that the consent dialog was not implemented correctly, point out that stuff was loaded before consenting, etc. They mostly found it annoying, of course. And of course no one thanked me for saving the business from running into complications with the law. As far as I know, I was the only one there pointing out the issues, as a backend dev; even the frontend team was blissfully ignorant of them.
The good thing about the heavy use of GTM is that it's easy to block. Just block that one endpoint and you remove most of the advertising and tracking. When some new advertising service is invented, it's already blocked thanks to the blocking of GTM.
Developers do that as well, especially now with LLM-assisted coding: accept a half-baked solution and go on to the next ticket.
I recently had a case at work: while filling out a contact form to add a new party, there were 300+ calls to the validation service to validate emails and phones. Three calls per character entered into every text input!
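For anyone hitting the same issue: the usual fix is to debounce the input handler so the validation request fires once after the user pauses typing, not on every keystroke. A minimal sketch (hypothetical helper, not from the comment above):

```typescript
// Wrap a function so rapid repeated calls collapse into a single call,
// made only after `delayMs` of quiet. Typical use: input validation.
function debounce<Args extends unknown[]>(
  fn: (...args: Args) => void,
  delayMs: number
): (...args: Args) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Args) => {
    if (timer !== undefined) clearTimeout(timer); // drop the pending call
    timer = setTimeout(() => fn(...args), delayMs); // reschedule from scratch
  };
}

// e.g. input.addEventListener("input", debounce(validateEmail, 300));
```

With a 300 ms delay, twenty keystrokes in a row produce one validation request instead of twenty (or sixty, at three calls per character).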
Yeah, never allow non-technical people to put something like Google Tag Manager, which can load arbitrary other stuff, on the business' website. The moment this is pushed through against engineering's advice, distancing yourself from the cesspool that the website will inevitably become sooner or later is the healthy choice. It is difficult to uphold the dam against the wishes of other departments like marketing and sales, and it takes an informed and ethically aware engineering department lead, who upholds principles and remains steadfast. Rare.
GDPR compliance is the first thing that goes out the window, and with it conforming to the law when in the EU. Ethics fly out the window at the same time, or just slightly afterwards: when they add tracking that no one agreed to, or when they forget to ask for consent, or when they have a "consent" popup that employs dark patterns, or when they outsource consent to a third-party tool that informed visitors don't want anything to do with.
Author here. Woke up today to see this on the front page; thank you to the person who submitted it! Initially, my biggest fear was the HN "Hug of Death" taking it down. Happily, Cloudflare's edge caching absorbed 19.24 GB of bandwidth in a few hours with a 98.5% cache hit ratio, so the origin server barely noticed.
The discussions here about DNS-level blocking and Pi-hole are spot on. It's interesting that the burden of a clean reading experience is slowly being offloaded to the user's network stack.
Can you add to clean reading experience by exchanging the unreadable font (Outfit Thin) with one that is readable?
Out of curiosity, do you have (and want to share) stats about requests per second? It's always nice to know these things for future reference.
No worries if not :)
Sure thing! I don't have the exact instantaneous peak since Cloudflare groups historical data by the hour on the free tier, but the peak 60 minutes last night saw 70,100 requests. That averages out to about 20 requests per second sustained over the hour. Wish I could be more granular but hope that helps a little.
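The back-of-the-envelope math checks out; as a quick sketch (figures taken from the comment above):

```typescript
// Sustained request rate implied by the peak hour. Cloudflare's free-tier
// analytics only give hourly buckets, so this is an average, not a peak.
const requestsPerHour = 70_100;
const secondsPerHour = 3_600;
const requestsPerSecond = requestsPerHour / secondsPerHour;
console.log(requestsPerSecond.toFixed(1)); // ≈ 19.5 req/s sustained
```

Worth remembering that the instantaneous peak within that hour was likely a multiple of the hourly average, since HN traffic arrives in bursts.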
Not the author, but last year I wrote about my experience being on #1 on HN [0]. I created a visualization of the requests hitting my server.
[0]: https://idiallo.com/blog/surviving-the-hug-of-death
2 replies →
Just wanted to say that article is so much deeper than it seems from the title, and also beautifully written. It was a great read!
I have done minor experiments with disabling JavaScript, and it works: most publications are far more readable with JavaScript disabled. You miss carousels and some interactive elements, but overall it's a much better experience.
These days the NYT is in a race to the bottom. I no longer even bother to bypass ads let alone read the news stories because of its page bloat and other annoyances. It's just not worth the effort.
Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
We'll simply cut the headline from the offending website, paste it into a search engine, and find another site with the same or similar info but easier access.
I no longer think about it as by now my actions are automatic. Rarely do I find an important story that's just limited to only one website, generally dozens have the story and because of syndication the alternative site one selects even has identical text and images.
My default browsing is with JavaScript defaulted to "off" and it's rare that I have to enable it (which I can do with just one click).
I never see ads on my Android phone or PC, and that includes YouTube. Disabling JavaScript on webpages nukes just about all ads; they just vanish, and any that slip through are then trapped by other means. In short, ads are optional. (YouTube doesn't work sans JS, so just use NewPipe or PipePipe to bypass ads.)
Disabling JavaScript also makes pages blindingly fast as all that unnecessary crap isn't loaded. Also, sans JS it's much harder for websites to violate one's privacy and sell one's data.
Do I feel guilty about skimming off info in this manner? No, not the slightest bit. If these sites played fair then it'd be a different matter but they don't. As they act like sleazebags they deserve to be treated as such.
It’s hard to beat https://lite.cnn.com and https://text.npr.org (I imagine their own employees likely use these as well) or https://newsminimalist.com
Love both of them. CNN has become a bit "left-leaning Fox News" for my taste, though.
If Al Jazeera or BBC had a similar text only site, that would be best. I really love the different perspectives.
I mostly use brutalist.report to find the articles, then deal with them on a case by case basis.
2 replies →
I also love https://www.cbc.ca/lite/news
They also compress the hell out of the images, so it all loads shockingly well on poor connections.
Ahh, I love them. The fact that they are fast, give you the exact thing you are looking for without any other noise is just amazing!
https://lite.cnn.com seems to load 200KB of CSS
5 replies →
I’m honestly dumbfounded that these exist
In the past some site had light versions, but I haven’t come across one in over 10 years
Makes me wonder if this isn’t just some rogue employee maintaining this without anyone else realizing it
It’s the light version, but ironically I would happily pay these ad networks a monthly $20 to just serve these lite pages and not track me. They don’t make anywhere close to that from me in a year
Sadly, here’s how it would go: they’d do it, it’d be successful, they’d IPO, after a few years they’d need growth, they’d introduce a new tier with ads, and eventually you’d somehow wind up watching ads again
2 replies →
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
They know this. They also know that web surfers like you would never actually buy a subscription and you have an ad blocker running to deny any revenue generation opportunities.
Visitors like you are a tiny minority who were never going to contribute revenue anyway. You’re doing them a very tiny favor by staying away instead of incrementally increasing their hosting bills.
> They also know that web surfers like you would never actually buy a subscription
I subscribe, and yet they still bombard me with ads. Fuck that. One reason I don’t use apps is that I can’t block ads.
>Visitors like you are a tiny minority who were never going to contribute revenue anyway.
It's closer to 30% that block ads. For subscription conversion, it's under 1%.
It's a large reason why the situation is so bad. But the internet is full of children, even grown children now in their 40's, who desperately still cling to this teenage idea that ad blocking will save the internet.
> They know this. They also know that web surfers like you would never actually buy a subscription ..
That's not true; I had a subscription for multiple years. I canceled it because:
A. They kept trying to show me bullshit ads. B. The overall quality of the content deteriorated, especially the opinion section.
That's why we need to spread the word and get more people using adblockers. It's not even a hard sell - the difference is so striking, once it has been seen, it sells itself, even for the most casual users.
I'm about to go full cycle.
For a while it looked like companies were going to offer a good product at a fair price. I started getting a few subscriptions to various services.
Then all of those services got enshittified. I got ads in paid accounts, slow loads, obvious data mining, etc.
Paying for services now often offers a degraded experience relative to less legitimate methods of access.
1 reply →
"Why would you feel guilty for not visiting a site you’re not paying for and where you’re blocking ads?"
This isn't as simple as it sounds; in fact, it's rather complicated (far too involved to cover in depth here).
In short, ethics are involved (and believe it or not I actually possess some)!
In the heyday of newsprint, people actually bought newspapers at a cheap, affordable price, and the bulk of their production was paid for by advertisements. We readers mostly paid for what we read, newspapers were profitable, and much journalism was of fair to good quality. Back then, I had no qualms about forking out a few cents for a copy of the NYT.
Come the internet the paradigm changed and we all know what happened next. In fact, I feel sorry about the demise of newsprint because what's replaced it is of significantly lesser value.
In principle I've no objection to paying for news but I will not do so for junk and ads that I cannot avoid (with magazines and newspapers ads are far less intrusive).
So what's the solution? It's difficult but I reckon there are a few worth considering. For example, I mentioned some while ago on HN that making micro payments to websites ought to be MUCH easier than it is now (this would apply to all websites and would also be a huge boon for open source developers).
What I had in mind was an anonymous "credit" card system with no strings attached. Go to your local supermarket, kiosk, or whatever, purchase a scratch card with a unique number to, say, the value of $50 for cash, and use that card to make very small payments to websites. Just enter the card's number and the transaction is done (only enter one's details if purchasing something that has to be delivered).
That way both the card and user remain anonymous if the user wishes, also one's privacy is preserved, etc. It could be implemented by blockchain or such.
The technical issues are simple but problems are obvious—and they're all political. Governments would go berserk and cry money laundering, tax evasion, criminal activity, etc., and the middlemen such as Master and Visa cards would scream to high heaven that their monopolies were being undercut.
In short, my proposal essentially parallels what now exists with cash: I go to a supermarket and pay cash for groceries, and the store doesn't need to know who I am. It ought to be no big deal, but it isn't.
It seems to me a very simple micro payments system without name, rank and serial number attached would solve many of the internet payment problems.
Sure, there'll always be hardline scavengers and scrapers but many people would be only too happy to pay a little amount for a service they wanted, especially so when they knew the money was going into producing better products.
For example, I'd dearly love to be able to say purchase a copy of LibreOffice for $10 - $20 and know there was enough money in the organisation to develop the product to be fully on par with MSO.
Trouble is when buying stuff on the internet there's a minimum barrier to overcome and it's too high for most people when it comes to making micro payments (especially when the numbers could run into the hundreds per week).
I cannot understand why those who'd benefit from such a scheme haven't at least attempted to push the matter.
Oh, and that's just one aspect of the problem.
Something about these JS-heavy sites I haven't seen discussed: They don't archive well.
Websites that load a big JS bundle, then use that to fetch the actual page content don't get archived properly by The Wayback Machine. That might not be a problem for corporate content, but lots of interesting content has already been lost to time because of this.
Depending on the site, unfortunately that might be interpreted as a feature and not a bug.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
Seems like a gross overestimation of how much facility people have with computers but they don't want random article readers anyway; they want subscribers who use the app or whatever.
> Surely news outlets like the NYT must realize that savvy web surfers like yours truly when encountering "difficult" news sites—those behind firewalls and or with megabytes of JavaScript bloat—will just go elsewhere or load pages without JavaScript.
No.
"savvy" web surfers are a rounding error in global audience terms. Vast majorities of web users, whether paying subscribers to a site like NYT or not, have no idea what a megabyte is, nor what javascript is, nor why they might want to care about either. The only consideration is whether the site has content they want to consume and whether or not it loads. It's true that a double digit % are using ad blockers, but they aren't doing this out of deep concerns about Javascript complexity.
Do what you have to do, but no one at the NYT is losing any sleep over people like us.
"…but no one at the NYT is losing any sleep over people like us."
Likely not, but they are over their lost revenues. The profitability of newspapers and magazines has been slashed to ribbons over the past couple of decades and internet revenues hardly nudge the graphs.
Internet beneficiaries are all new players, Google et al.
1 reply →
> We'll simply cut the headlines from the offending website and paste it into a search engine and find another site with the same or similar info but with easier access.
Where do you trust to read the news? Any newsrooms well staffed enough to verify stories (and not just reprint hearsay) seem to have the same issues.
The AP and Reuters are well-staffed and have functional websites. The sites aren’t great (they’ve been afflicted with bloat and advertising along with most outlets, just at a marginally lower rate), but they are at least usable.
I don't understand all these sites with moving parts, even with the sound muted, as if everything was a collection of GIFs. The NYT followed this path and started inserting muted clips prominently on their page one; very, very annoying.
Do you think youtube will continue to make it possible to use alternate clients, or eventually go the way of e.g. Netflix with DRM so you're forced to use their client and watch ads?
YouTube is already actively blocking alternative clients. that's why yt-dlp needs a JavaScript runtime these days: https://github.com/yt-dlp/yt-dlp/wiki/EJS
They are also not averse to using legal means to block them. For example, back when Microsoft shipped Windows Phone, Google refused to make an official YouTube client for it, so Microsoft hacked together its own. Google forced them to remove it from the store: https://www.windowscentral.com/google-microsoft-remove-youtu...
If Google were just starting YouTube today then DRM would likely be enforced through a dedicated app. The trouble for Google is that millions watch YouTube through web browsers many of whom aren't even using a Google account let alone even being subscribers to a particular YouTube page. Viewership would drop dramatically.
Only several days ago I watched the presenter of RobWords whinging about wanting more subscribers, saying that many more people just watch his presentations than watch and also subscribe.
The other problem YouTube has is that, unlike Netflix et al. with their high-ranking commercial content, it hosts millions of small presenters who don't use advertising and/or just want to tell the world at large their particular stories. Enforced DRM would altogether ruin that ecosystem.
Big tech will slowly enforce "secure browsing" and "secure OS" in a way that will make it impossible to browse the web without a signed executable approved by them. DRM is just a temporary stopgap.
3 replies →
What does playing fair mean in this context? It would be one thing if you were a paid subscriber complaining that even paying sucks so you left, but it sounds like you’re not.
It is strange to hear these threats about avoiding websites from people who are not subscribers and also definitely using an ad blocker.
News sites aren’t publishing their content for the warm fuzzy feeling of seeing their visitor count go up. They’re running businesses. If you’re dead set on not paying and not seeing ads, it’s actually better for them that you don’t visit the site at all.
3 replies →
I am a paid subscriber to the NYT and have been reading it, on paper and the internet, for 30+ years. It is an enshittification winner in terms of tracking and clickbait. It doesn't feel like a serious news outlet anymore; it feels like HuffPost or similar.
1 reply →
I'd like to answer that in detail but it's impractical to do so here as it'd take pages. As a starter though begin with them not violating users' privacy.
Another quick point: my observation is that the worse the ad problem, the lower the quality of the content. Cory Doctorow's "enshittification" encapsulates the problems in a nutshell.
2 replies →
You're right, it means nothing. But it cuts two ways. These sites are sending me bytes and I choose which bytes I visualize (via an ad blocker). Any expectation the website has about how I consume the content has no meaning and it's entirely their problem.
The NYT is comically bad. Most of their (paywalled) articles include the full text in a JSON blob, and that text is typically 2-4% of the HTML. Most of the other 96-98% is ads and tracking. If you allow those to do their thing, you're looking at probably two orders of magnitude more overhead.
My family's first broadband internet connection, circa 2005, came with a monthly data quota of 400 MB.
The fundamental problem of journalism is that the economics no longer works out. Historically, the price of a copy of a newspaper barely covered the cost of printing; the rest of the cost was covered by advertising. And there was an awful lot of advertising: everything was advertised in newspapers. Facebook Marketplace and Craigslist were a section of the newspaper, as was whichever website you check for used cars or real estate listings. Journalism had to be subsidised by advertising, because most people aren't actually that interested in the news to pay the full cost of quality reporting; nowadays, the only newspapers that are thriving are those that aggressively target those who have an immediate financial interest in knowing what's going on: the Financial Times, Bloomberg, and so on.
The fact is that for most people, the news was interesting because it was new every day. Now that there is a more compelling flood of entertainment in television and the internet, news reporting is becoming a niche product.
The lengths that news websites are going to to extract data from their readers to sell to data brokers is just a last-ditch attempt to remain profitable.
> The fundamental problem of journalism is that the economics no longer works out.
Yes it does. From the NYT's actual earnings release for Q4 2025:
1. The Company added approximately 450,000 net digital-only subscribers compared with the end of the third quarter of 2025, bringing the total number of subscribers to 12.78 million.
2. Total digital-only average revenue per user (“ARPU”) increased 0.7 percent year-over-year to $9.72
2025 subscription revenue was 1.950 billion dollars. Advertising was 565 million that includes 155 million dollars worth of print advertising.
Sure operating profit is only 550 million very close to the advertising revenue, but the bulk of their income is subscriptions, they could make it work if they had to. My suspicion is that if they dropped all the google ads they could have better subscription retention and conversion rates as well.
I remember getting punished by my parents for downloading a 120 MB World of Tanks update over metered home internet. Our monthly quota was 250 MB. It was not that long ago: 2010.
2010 was a couple years after YouTube enabled 1080p uploads. 250MB a month was insanely small in 2010.
1 reply →
In the late '90s, a friend recommended I download some freeware from a website. It was 1.2MB. I told him "are you crazy? 1.2MB? It's gonna take a whole week!"
A lot of free government phone plans supplied to homeless, parolees etc in the USA only come with 3GB of transfer credit, which is usually burned up in about 3 days, leaving them without any Internet access. (or sometimes it'll drop to a throttled connection that is so slow that it can never even load Google Maps)
I just loaded the nytimes.com page as an experiment. The volume of tracking pixels and other ad nonsense is truly horrifying.
But at least in terms of the headline metric of bandwidth, it's somewhat less horrifying. With my ad blocker off, Firefox showed 44.47 MB transferred. Of that, 36.30 MB was MP4 video. These videos were journalistic in nature (they were not ads).
So, yes in general, this is like the Hindenburg of web pages. But I still think it's worth noting that 80% of that headline bandwidth is videos, which is just part of the site's content. One could argue that it is too video heavy, but that's an editorial issue, not an engineering issue.
Why are we supposed to think it's normal to see videos on every page? Even where a video is directly relevant to the current page, what's the justification for thrusting those 36.30 MB on the user before they explicitly click play?
I don’t think you are supposed to think anything.
It’s a news site with a lot of auto-playing video. If you like that kind of content, great. If not, there’s lots of other websites with different mixes of content. I subscribe to the economist which has few videos and they never auto play.
But that’s a question of taste. 5mb of JavaScript and hundreds of tracking assets is not.
Is that with Firefox's built-in tracking prevention disabled?
I also use and like the comparison in units of Windows 95 installs (~40MB), which is also rather ironic in that Win95 was widely considered bloated when it was released.
While this article focuses on ads, it's worth noting that sites have had ads for a long time, but it's their obnoxiousness and resource usage that's increased wildly over time. I wouldn't mind small sponsored links and (non-animated!) banners, but the moment I enable JS to read an article and it results in a flurry of shit flying all over the page and trying to get my attention, I leave promptly.
I would love for someone more knowledgeable in this space than I to chime in on the economics of this industry
Are the few cents you get from antagonizing users really worth it?
I suspect the answer is simple and that most users don’t give a shit
I think it has to do a lot with when you came of age - I’m in my late 30s, I got my first tech job at 14 as a sys admin for a large school district, and every single developer, admin, etc that I knew was already going on about the free internet. As a result, I’ve never had a tolerance for anything but the most reasonable advertisements
I think that ideology is necessary to care enough and be motivated enough to really get rid of ads, how fucking awful the websites are alone should be enough but for most people it isn’t
Not only are loading times and total network usage ridiculous, sites will continue to violate your privacy via trackers and waste your CPU even when background idling. I've written about these issues a few times in the last few years, so just sharing for those interested:
A comparison of CPU usage for idling popular webpages: https://ericra.com/writing/site_cpu.html
Regarding tracker domains on the New Yorker site: https://ericra.com/writing/tracker_new_yorker.html
Allowing scripting on websites (in the mid-90s) was a completely wrong decision. And an outrage. Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust. That’s completely unacceptable; it’s fundamentally flawed. Of course, you disable scripts on websites. But there are sites that are so broken that they no longer work properly, since the developers are apparently so confused that they assume people only view their pages with JavaScript enabled.
It would have been so much better if we had simply decided back in the ’90s that executable programs and HTML don’t belong together. The world would be so much better today.
> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust
Would've been cool if we could know if site X served the same JS as before. Like a system (maybe even decentralized) where people could upload hashes of the JS files for a site. Someone could even review them and post their opinions. But mainly you'll know you're getting the same JS as before - that the site hasn't been hacked or that you're not being targeted personally. If a file needs to update, the site could say in the changelog something like "updated the JS file used for collapsing comments to fix a bug". This could be pushed by the users to the system.
Especially important for banking sites and webmail.
Stepping back, it's pretty ridiculous that I need to download executable code, often bloated, solely to view read-only content. Just render the thing on the backend and send it to the client.
But the web-dev-hype people told me that JS-heavy SPAs (and their associated designs) were faster and better for the user!
I didn’t bother validating this, but I’m sure they wouldn’t lie or misinterpret!!
> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust.
JavaScript and WebAssembly programs are always executed in a sandboxed VM, without read access to the host OS files (unless, of course, you grant it).
Enabling scripting was a necessary step for interactive websites. Without it, a full page load would be required every time you upvote a Hacker News comment. In my opinion, the real problem is that browsers allow too many connections to third-party domains, which are mostly ads and trackers. Those should require user-approved permissions instead of being the default.
Why can't MY browser send some random JS to THEIR website? If it's safe for me to run some stranger's code, should it be safe for strangers to run my code?
Disable not just JavaScript, but also CSS. I'm not kidding. Many websites actually contain all the content in the HTML but use CSS to hide it and then JavaScript to show it again.
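A minimal sketch of that anti-pattern (hypothetical markup, not taken from any specific site): the text is already in the HTML, but the stylesheet hides it and a script re-reveals it, so with CSS and JS both disabled the content renders plainly.

```html
<!-- Anti-pattern: content shipped in HTML, hidden by CSS, shown by JS. -->
<style>
  .article-body { display: none; } /* hidden by default */
</style>
<div class="article-body">The actual article text is right here.</div>
<script>
  /* only runs with JS enabled; without it the CSS keeps the text hidden */
  document.querySelector(".article-body").style.display = "block";
</script>
```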
If scripting wasn't allowed, we'd probably all be using a different browser that did allow it - probably wrapped in a Flash wrapper.
There is obviously huge demand for scripting on websites. There is no one authority on what gets allowed on the web, if the existing orgs didn't implement it, someone else would have and users would have moved over when they saw they could access new more capable, interactive pages.
The 49MB webpage just shows what our priorities are. It shows the target audience has fast internet that can load this without issues. On my average home connection in Australia, I can download a 49MB page in 0.3 seconds. We spend time optimising for what matters to the end user.
Is the page actually done downloading 0.3 seconds after it starts? Or is it just (your Internet speed) / 49MB = 0.3 seconds?
This is just the tip of the iceberg. Don't get me started on airline websites (looking at you, Air Canada), where the product owner, designers, and developers are not able to get a simple workflow straight without loading megabytes of useless JavaScript and interrupting the user journey multiple times. Give me back a command-line terminal like Amadeus; that would be perfect.
How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
> Don't get me started on airlines websites
You can't beat China Southern. They have the most dogshit website I've ever seen. The flight was fine, but I gave up on online check-in after 3 attempts. Never mind the bloat:
- required text fields with wrong or missing labels. One field was labeled "ticket no.". It kept getting rejected. I randomly tried passport number instead. It worked.
- sometimes fields only have a placeholder that you can't fully read because the field isn't wide enough ("Please enter the correct...") and the placeholder disappears once you start typing.
- date picker is randomly in Chinese
- makes you go through a multi-step seat selection process only to tell you at the end that seat selection is no longer possible.
- signed up with email; logged out and went back to the SAME login page; now sign up via phone number is required!?
Almost nobody uses websites in China. You're expected to interact with them via WeChat Mini-app where the experience is generally fine.
> How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
Loudly oppose the trend-chasing devs who have been brainwashed into the "newer is better" mindset by Big Tech. I'm sure the shareholders would want to reduce the amount they spend on server/bandwidth costs and on "development and maintenance" too.
Simple HTML forms can already make for a very usable and cheap site, yet a whole generation of developers have been fed propaganda about how they need to use JS for everything.
> How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
Or for developers to pad their CV.
Sadly, I think the only answer is some other form of payment than ad clicks. I've no idea what that could be, though.
Modern web dev is ridiculous. Most websites are an ad-ridden tracking hellscape. Seeing sites like HN, where individual lines of JS are taken seriously, is a godsend. Make the web less bloated.
What really blows my mind is how unusable it is. How often do you visit a web site and there’s so much crap overlaid on the page that you can barely see the actual content? Surely that can’t be good for their ability to make money, yet they persist.
I agree with this. Just because a website can eat all my RAM doesn't mean it should...
Relevant and fun read: The Website Obesity Crisis (2015)
https://idlewords.com/talks/website_obesity.htm
Was about to add this too!
I started writing in a Dioxus (Rust framework) style: max 1KB of JS code, sending the diff via WebSocket from the Rust server, and, more importantly, all the code now lives on the server - and because of WebSocket and Rust it executes at almost the same speed as on the client. Back to normal page sizes. And, of course, virtual scrolling everywhere.
This is why people continue to lament Google Reader (and RSS in general): it was a way to read content on your own terms, without getting hijacked by ads.
RSS and feed readers still exist! All hope is not lost.
Sure, I use Feedly myself, but RSS is increasingly marginalized. I use it to follow blogs, but it's not usable for mainstream media, Reddit, HN, etc.
Got to this article/comments section via FreshRSS. Still the greatest way to consume media on the web.
Why lament it? I've been using Inoreader for over a decade after Google Reader went away. And I gladly pay for it year after year.
People should stop lamenting Google Reader and start using RSS. There are numerous threads about it on HN, e.g., https://news.ycombinator.com/item?id=45459233
Why on earth do you have to rely on Alphabet, an ad company, to read RSS? There are many other options that are not made by an ad company.
Google Reader was never the answer. It's such a shame that people even here don't realize that relying on Google for that had interests at odds - and you weren't part of the equation at all.
Well, except for your data. You didn't give them enough data. So they shut up shop. Gmail though, amirite? :D
Yeah I wonder why gmail was not one of the shut down products /s
Author forgot to mention scroll hijacking on their list. This is one of the worst offenses.
I remember in 2008, when Wizards of the Coast re-launched the official Dungeons & Dragons website to coincide with the announcement of the fourth edition rules. The site was something in the region of 4 MB, plus a 20 MB embedded video file. A huge number of people were refreshing the site to see what the announcement was, and it was completely slammed. Nobody could watch the trailer until they uploaded it to YouTube later.
4 MB was an absurd size for a website in 2008. It's still an absurd size for a website.
> users are greeted by what I call Z-Index Warfare
Nice term!
> Or better yet, inject the newsletter signup as a styled, non-intrusive div between paragraphs 4 and 5. If the user has scrolled that far, they are engaged.
They're engaged with the content! There is no way to make some irrelevant signup "non-intrusive". It's similar to links to unrelated articles - do you want users to actually read the article or jump around reading headlines?
Z-index warfare is so stupid (the practice, not the name). It's not only with media publications, but ecommerce sites as well.
There have been countless businesses that have lost out on my money because I clicked on their ad (good job!), started reading the product info on their website as it loaded (that's basically a sale), and then the page finished loading with a barrage of popups for cookie consent, newsletter signup for a discount code, special offers, a special sale, spin the wheel for a prize, etc. That's when I close the tab and forget about it.
This rubbish also exists disproportionately for recipe pages/cooking websites as well.
You have 20 ads scattered around, an autoplaying video of some random recipe or ad, 2-3 popups to subscribe or buy some affiliated product, then the author's life story, then a story ABOUT the recipe, all before I'm able to see the detailed recipe in a proper format.
It's second nature for me to open all these websites in reader mode at this point.
Good sites do exist. It's just that they drown.
I remember browsing the web in 1993-1994. It was literally a list of webpages. Yahoo was there, though, so presumably they've fallen farthest?
True, these ad heavy cooking sites also dabble extensively in SEOmaxxing their way to the top.
You want to know why so many people either jump straight to comments or use alternate sources (archive, llms)? Because if you load the actual site, it freaking blows to use the damn thing.
So much hostile user design.
Edit: NPR gets a little shout out for being able to close their annoying pop-ups by clicking anywhere that's not the notification. So it's still crappy that it hijacks the screen, but not awful I guess?
It's really hard to consider any kind of web dev as "engineering." Outcomes like this show that they don't have any particular care for constraints. It's throw-spaghetti-at-the-wall YOLO programming.
There are plenty of web devs who care about performance and engineering quality. But caring about such things when you work on something like a news site is impossible: these sites make their money through user tracking, and it's literally your job to stuff in as many 3rd-party trackers as management tells you to. Any dev who says no on the basis that it'll slow the site down will get fired as quickly as a chef who gets a shift job at McDonald's and tries to argue for better cuisine.
it's still engineering, just for different constraints - cost & speed.
It's almost criminal that the article does not mention network-wide DNS blocklists as an obvious solution to this problem. I stop nearly 100% of ads in their tracks using the Hagezi ultimate list, and run uBlock on desktop for cosmetic filtering and YouTube.
I should really run some tests to figure out how much lighter the load on my link is thanks to the filter.
I also manually added some additional domains (mostly fonts by Google and Adobe) to further reduce load and improve privacy.
It does indeed work pretty well today, but they have already developed ways to circumvent it. For example, serving ads from the same domain as the main page.
"…how much lighter the load on my link is thanks to the filter."
Not done any rigorous tests, but my experience has been that it can be less than a tenth.
It would be less hypocritical if that critique of the situation wasn't posted on a website that itself loads unnecessary 3rd party resources (e.g. cloudflare insights).
Luckily I use a proper content blocker (uBlock Origin in hard mode).
Even with ad blocking, it's transferring over 200KB of data, half of which is to load a couple of fonts. Not terrible but the basic HTML is only 17KB.
One of the things I don't get is the economics of these trackers.
Someone is serving this amount of data to every visitor. Even if you want to track the user as much as possible, wouldn't it make sense to figure out how to do that with the least amount of data transfer possible as that would dramatically reduce your operating cost?
Perhaps size optimization is the next frontier for these trackers.
Traffic for the static payload is super cheap. And the insane number of requests is handled easily by modern event-based architectures. The operating costs are most likely only a tiny part of the overall economics of the tracker's business model. The generated tracking data is certainly worth an order of magnitude more than it costs to generate it.
> I don't know where this fascination with getting everyone to download your app comes from.
Apps don't have adblockers.
Most in-app ads can be blocked by simply changing your DNS.
And most people don't even use an ad blocker when browsing normal sites. I kind of had to teach the people around me to use one.
Isn't that a much higher bar for most people? I don't even know how to change the dns on my phone, and I am not a typical user, I am a dev.
But running a browser that blocks ads, like DuckDuckGo's, is super simple.
But yeah I am sure there are additional tracking and possibly retention benefits.
The layout of news sites peaked with cnn.com back in the 1998 to 2002 timeframe. It's been downhill ever since.
When working at the BBC in the late 90s, the ops team would start growling at you if a site's home page was over 70kb...
The newest thing is "please wait a minute while Cloudflare decides you're not a bot." So you sit through that, then deal with the GDPR banner, then you get to watch an ad.
I was thinking about creating charts of shame for this across some sites. Is there a browser extension that categorizes the data sources and requests, like in a pie chart or table? Tracking, ad media, first-party site content...? It would be nice to have a stacked bar chart with segments colored by data category.
Maybe you'd need one chart for request counts (to make tracking stand out more) and another for the amount of data transferred.
Recently I read a survey claiming that one of the websites they reviewed shipped 50 MB of CSS.
I started on this project when I was at The New Yorker. I had just managed to convince people to give us space to do web performance optimization - and then we had to drop it quickly to work on AMP. Very frustrating.
This site was created to give developers and PMs some ammunition to work on improving load speed:
https://webperf.xyz/
The leader https://nautil.us on your board is incredibly fast!
meanwhile everyone tells me i have to shave every KB off my web app
Here's something I wrote in 2021:
> Today, there's ~30 times more js than html on homepages of websites (from a list of websites from 5 years ago).
It seems that this number only goes up.
I think it's a GOOD thing, actually, because all these publications are dying anyway. And even if you filter out all the ad and surveillance trash, you're left with trash propaganda and brain-rot content. Why even make the effort of extracting the actual text from some "journalist" at these propaganda outlets? It's not even worth it.
If people tune out only because how horrible the sites are, good.
I think that unless one is well-connected or is willing to pay significant money, time, and/or effort to obtain high-quality information, one will still generally get more accurate information about the world by reading between the lines of the propaganda than one would by not reading the propaganda at all.
rule #1 is to always give your JS devs only Core 2 Quad CPUs + 16GB of RAM
they won't be able to complain about low memory, but their experience will be terrible every time they try to shove something horrible into the codebase
This is about to get substantially worse as companies introduce more AI into their workflows.
I'm thinking I'm gonna start making all my webpages <1MB in size, and compensate by adding Windows 95 to each page load.
Even enterprise COTS products can have some of these issues. We have an on-premise Atlassian suite, and Jira pages sometimes have upwards of 30MB total payloads for loading a simple user story page — and keep in mind there is no ad-tech or other nonsense going on here, it’s just pure page content.
>I don't know where this fascination with getting everyone to download your app comes from.
So they can do exactly what they're doing on the web, and maybe even more, but with native code so it feels much faster.
I've gotten to the point of wondering why all the tracking companies and ad networks can't just share and use the same library.
But on web page bloat: let's not forget apps are insanely large as well. 300-700MB for a banking, travel, or other shopping app. Even if you cut 100MB of L10n, they're still large just because of, again, tracking and other things.
In the same vein, https://512kb.club/ is a user-submitted website that features content under 512 KB in size! (blogs, news, etc.)
The article says "I don't know where this fascination with getting everyone to download your app comes from."
The answer is really simple and follows on from this article; the purpose of the app is even more privacy violation and tracking.
Yes, it's 100% horrible. For me the solution is simple. If I click a link and the page is covered in ads and popup videos I CLOSE THE PAGE!!!!
Vote with your behavior. Stop going to these sites!
I was really surprised when I went to book a flight on Frontier (don't judge me!) and a request from analytics.tiktok.com loaded. I have a lot of discomfort about that. Bloat and surveillance go hand in hand.
Oh yeah, that old topic. We’ve already discussed this back when text-heavy websites started reaching megabyte sizes. So I’m going to go look for the posts in this thread that try to explain and defend that. I’m especially looking forward to the discussions about whether ad blocking is theft or morally reprehensible. If those are still around.
This site more or less practices what it preaches. `newsbanner.webp` is 87.1KB (downloaded and saved; the Network tab in Firefox may report a few times that, and I don't know why); the total image size is less than a meg, and then there's just 65.6KB of HTML and 15.5KB of CSS.
And it works without JavaScript... but there does appear to be some tracking stuff. A deferred call out to Cloudflare, a hit counter I think? and some inline stuff at the bottom that defers some local CDN thing the old-fashioned way. Noscript catches all of this and I didn't feel like allowing it in order to weigh it.
This is why I have a pi-hole and a selection of addons for my browser.
I am considering moving to technitium though, it seems better featured.
Let's play a fun prediction: I ask HN readers what will be the page size of NYTimes.com in 10 years? Or 20 years?
Want to bet 100 MB? 1 GB? Is it unthinkable?
20 years ago, a 49 MB home page was unthinkable.
In 10 years: 100MB. In 20 years: 20MB.
I think there'll continue to be growth in page sizes, but then maybe we'll consider efficiency, or the NYTimes shuts down and the 20MB page will be the liquidators selling the domain. Maybe we don't even use domains by then as everything is on an app.
> "Does anyone even care about how their end-product appears to a user anymore?"
Of course not. It's all about maximising shareholder value. The users aren't a consideration anymore.
Only major media can get away with this kind of bloat. For the normal website, Google would never include you in the SERPs even if your page is a fraction of that size.
I actually feel offended by a download size this big; it's completely careless of the website owners.
An anecdote from an OG (me):
The same phenomenon worsened during the DotCom Meltdown and the Great Financial Crisis. This accelerated desperation is a sign of the times; paying subscribers are likely cancelling due to current economic conditions.
I opened a startup's page the other day, and their streaming demo video was 550MB.
Bit unfair - I turned off my ad blocker and loaded the NY Times website with cache disabled via dev tools, and it came to 3MB. Still pretty damn high, but not 49MB. (I will say I'm in the UK, so it might be different across the pond.)
I'm also in the UK, and it came to 31MB. Then I turned off uBO, Firefox tracking protection, and rejected the cookie notice, and it went over 40MB.
I cannot even imagine browsing the internet or using my devices without Consent-O-Matic and NextDNS.
with almost all options and filters enabled, of course
and the NYT web team was praised as one of the best in the world some (many?) years ago.
Some of them are good (e.g., Rich Harris - Svelte[0]); some of them should stop podcasting.
[0]: https://svelte.dev/
previously: nytlabs.com https://web.archive.org/web/20191025052129/http://nytlabs.co...
now: https://rd.nytimes.com
"The Sticky Video Player Publishers love embedding auto-playing videos these days, which isn't really popular. You'll find mulitple forum, Reddit, HN, or Twitter threads about it.
To make it somehow worse...when you scroll down, you think it would leave you as it leaves the viewport. No. It detaches, shrinks and pins itself to the bottom right of your screen and continues playing. It keeps the distraction going and as if teasing you, features a microscopic 'X' button with a tiny hit area (violating Fitts's Law)."
Is there no way to stop this? The "do not autoplay videos" option often doesn't work.
Blocking the domain of the video player works. For example, for primis.tech you can add a rule to your uBO filters.
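For reference, a standard uBO static network filter for that domain (blocking everything served from primis.tech and its subdomains) looks like:

```
||primis.tech^
```

Add it under "My filters" in the uBlock Origin dashboard.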
Primis is one of them, but there are a few of these companies. I can't remember them all.
> The user must perform visual triage, identify the close icons (which are deliberately given low contrast) and execute side quests just to access the 5KB of text they came for.
The thing is, though... they _don't_ have to. It's been my standard practice for years to just tap ctrl-w the moment any web page pops up a modal box. Some leeway is given to cookie dialogs _if_ they have a disagree/disable button _prominently_ visible; otherwise they're ctrl-w'd too.
"Newsletter..." ctrl-w.
"Please disable your..." ctrl-w.
"Subscribe to read..." ctr-w.
Ctrl-w is your friend.
Got to hand it to this guy, this page loads FAST.
I hate this trend of active distraction. Most blogs have a popup asking you to subscribe as soon as you start scrolling.
It’s as if everyone designed their website around the KPI of irritating your visitors and getting them to leave ASAP.
Removing the round navbar in the other pages is unsettling.
49MB web page? How about a 49MB Go CLI.
Imagine just paying for content
I recently had to switch to a Japanese LINE account to gain access to certain features. I had no idea how good I had it on my American LINE account. The Japanese account is covered with ads EVERYWHERE on the home screen and even in the chat area. I have no idea how this app is still popular in Japan. I would pay to remove the ads if I could.
They also have their own tiktok and AI slop that I never knew about.
49MB web page? Try a 45MB GraphQL response.
Another reason for this is that non-tech people can often inject third-party scripts via the CMS, GTM, and so on. I remember once we had a large drop in indexed pages on Google, and it turned out that a script had moved our entire site into an iframe. The marketing people who injected it were like "it is just a script".
uBlock Origin helps mitigate this at least a little bit here.
I worked at big newspapers as a software engineer. Please do not blame the engineers for this mess. As the article says news is in a predicament because of the ads business model. Subscriptions alone usually cannot cover all costs and ads will invariably make their way in.
For every 1 engineer it seems like there are 5 PMs who need to improve KPIs somehow and thus decide auto playing video will improve metrics. It does. It also makes people hate using your website.
I would constantly try to push back against the bullshit they'd put on the page but no one really cares what a random engineer thinks.
I don't think there's any real way to solve this unless we either get less intrusive ad tech or news gets a better business model. Many sites don't even try with new business models, like local classifieds or local job boards. And good luck getting PMs to listen to an engineer talking about these things.
For now, the bloat remains.
The sad thing is, this is already a paywalled site.
I’m afraid someone who wants to support professional journalism and agrees to pay ~$300/yr for an NYT subscription still gets most (all?) of this nonsense?
It's certainly one of the reasons why I ended my NY Times subscription in 2024, and split that money between recurring donations to public media, Archive.org, and the EFF.
Every time some site or person tries to make me feel bad for using AdGuard DNS, ad blockers etc. I read an article like this and I feel fine.
I see three options:
1. Show me reasonable ads and I will disable ad blocking
2. Do the crap described in this article and don't complain when I arm myself against it
3. Do a hard paywall and no ads; force me to pay to see your content
It's my device. I decide what I download, execute and display on my device. A website is free to offer me to download an ad and I am free to decline that offer. Demanding me to download anything on my device or even worse execute someone else's programs [JS] and claiming that I have a moral obligation to do so is deeply creepy.
It's a crazy goal, amazing, congrats. Is it a vanilla stack?
Maybe I'm just getting old, but I've gotten tired of these "Journalists shouldn't try to make their living by finding profitable ads, they should just put in ads that look pretty but pay almost nothing and supplement their income by working at McDonalds" takes.
Well, I'm going to block the ads anyway (or just leave), so if they're trying to find profitable ads, they may need to revise their strategy.
“I’m going to either steal your work in a way you don’t consent to, or not consume it” isn’t really great. The alternative is paywalls
I'm pretty sure people would read more and click on more ads if they didn't have to endure waiting for 49 MB of crap and then navigating a pop-up obstacle course for each article.
100,000 pageviews at a $0.01 CPM is way worse for them than 10,000 pageviews at a $2 CPM.
If the ad-tech sausage factory needs 49MB of JS for a clickbait article, that is not "earning" a living. They are just externalizing costs to users and ISPs. You can defend the hustle, but the scale of waste here is cartoonish.
If you need a CDN and half a browser runtime just to show 800 words about celebrity nonsense, the business model is broken. Everyone else is footing the bandwidth bill for nonsense they never asked to receive.
In the case of the New York Times, they have subscriptions and many are willing to pay for their work - but their subscriptions are not ad-free.
This is what killed my willingness to subscribe to most outlets. If I'm paying, I expect the page to load in under a second with zero tracking. Instead you get the same bloated experience minus a banner ad or two.
This argument would be valid if journalism were actually journalism, instead of ripping off trending stories from HN and Reddit, rehashing them with sloppy AI, calling it a day, and burying four lines of text inside 400 ads.
I don't like the state of journalism either but you realize this is a vicious cycle, no? People not paying for news (by buying newspaper, or more importantly paying for classified ads) leading to low quality online reporting leading to people not wanting to pay for online news.
49MB or homelessness? There are surely other options.
If you can think of any, then congratulations! You've saved journalism!
You should probably tell someone so the knowledge doesn't die with you.
Solution, see my post. ;-)
> Journalists shouldn't try to make their living by finding profitable ads
I mean, they can absolutely try. That doesn't mean they should succeed.