Comment by jrockway
6 years ago
I feel like many of the text-mode browsers have failed to keep up with changing web standards. We were all mad when IE was holding back the Internet, and I'm not sure we should give lynx and w3m a pass because they're geek tools. (Accessibility is an important concern, but web browsers running under a GUI system support screen readers.)
https://www.brow.sh/ is a console-based browser that claims to support modern standards. Perhaps that is what we should be using.
(I am now prepared for 6 comments replying to me saying that anything that can be implemented with HTML from 1999 should be, and that a list of search results can be. I guess. If all that stuff works for everyone, why did we invent new stuff? Just because? Or perhaps it wasn't really as amazing as we all remember.)
I'll be the first (EDIT: third) of six.
> If all that stuff works for everyone, why did we invent new stuff? Just because? Or perhaps it wasn't really as amazing as we all remember.
To better track people and push ads. That's really mostly it. The modern web has very little to do with providing value to the end-user; any utility that's provided is mostly a side effect, and/or a vector to lure people into situations where they can be monetized.
Text browsers aren't holding the web down, they're anchoring it in the port of productivity, even as the winds of commerce desperately try to blow it onto the open seas of exploitation.
Come on, you can't be serious about this.
Creating sophisticated web pages is massively easier than 10 or 20 years ago. Yes, the HTML of plain, simple text-only pages is still pretty much the same, but most users actually prefer visually fancier content with pictures and colors.
Yes, companies presenting themselves online profit from more capabilities. And yes, presenting ads is probably easier too. But if you think those changes were made just out of monetary greed, you could say the same about almost any technological advancement, like color photography or electric cars, because all of these had a commercial side to them too.
I am serious. Yes, it's true that it's easier than ever to create sophisticated websites. But it's also true that almost all this sophistication delivers negative value to users: it distracts them, slows them down, and forces them to keep buying faster hardware. It doesn't have to be like this, but it currently is. It's not even just a problem of explicit business incentives; the technical side of the industry has been compromised. The "best practices" in web development and user experience all deoptimize end-user ergonomics and productivity.
44 replies →
> most users actually prefer visually fancier content with pictures and colors.
You're aware that pure HTML and CSS alone can produce visually fancy content with pictures and colors, right? It honestly seems like a lot of web developers are starting to forget this, but it's true, I swear. My personal web site (https://coyotetracks.org/) is minimalist by design, but it has accent colors, custom fonts, Retina-ready images, and that silly little fade in and out when you hover over links, all without any JavaScript whatsoever. Also: turns out it works pretty well on Lynx!
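For what it's worth, that hover fade is a couple of lines of CSS (a generic sketch, not my site's actual stylesheet):

    /* Links fade in and out on hover; no JavaScript involved */
    a { transition: opacity 0.25s ease-in-out; }
    a:hover { opacity: 0.5; }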
I think JS gets a bit of a bad rap these days and am willing to leap to its defense even though I don't like it much as a language, but a huge chunk of the reason it has a bad rap is because people do bad things with it, by which I mean either malicious things or just unnecessary things. An awful lot of modern web sites could run just fine with far, far fewer scripts than they're deploying.
(And, yes, I can even do web analytics without JavaScript, because there are these things called "server logs" I can analyze with utilities like GoAccess.)
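(For example, something like this; the exact invocation depends on your server and log format:)

    goaccess /var/log/nginx/access.log -o report.html --log-format=COMBINED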
1 reply →
Sure, new technologies have changed things for the better. Sophisticated web pages should be created when sophisticated web pages are necessary.
Sophisticated web pages are not necessary to disseminate text-only content. '90s HTML is perfectly capable of doing that.
I have no problem with loading 30MB of JS libraries into the browser for an application that actually does something. I have a problem with loading 30MB of shit to read 10kB worth of news.
It's actually because we programmers like to recreate things, because there's this itch and wonder about how stuff works. HTTP hasn't changed that much.
HTML did. It was hypertext, and rendering was done for document flow.
Then we could script a bit, and soon after we wanted "web applications". Now, we've lost probably 15 years trying to fit an application UI and lifecycle model into a document-flow model.
HTML, or rather XML, or rather trees, are a good way to represent a user interface. Unfortunately, back then the only languages available for any proper work were C++ and Java (oh yeah, and Visual Basic!).
JavaScript, PHP, and Perl were a godsend in terms of productivity. Just like the '80s home computers and BASIC. It just worked. Type and run. This is also why bad software gets popular, btw.
Coming back to the post: Lynx renders HTML as it was intended, as a document.
3 replies →
>Creating sophisticated web pages is massively easier than 10 or 20 years ago.
And still, average people aren't taking advantage of this and creating their own websites, because despite it being massively easier, the way the web is now has pushed it beyond the average person's reach. If these "sophisticated" stacks and technologies weren't the norm, and the web were instead focused on being a place where the average person can easily put up their own fancy-looking simple webpage, maybe we wouldn't be so dominated by these massive companies, which have become the gatekeepers by providing limited platforms for people to do what used to be a relatively easy thing even back in the day.
There's a reason these companies fund and push these technology stacks: it gives them huge control over the internet, and in the end they don't really do anything fundamentally different from what good old-fashioned HTML and CSS can do. Hell, especially the HTML of today.
A lot of today's websites are evil. Just today, I couldn't even select and copy some simple, plain-looking text.
1 reply →
> most users actually prefer visually fancier content with pictures and colors
Are you sure this is the case? Because I think it may be a shiny object trap, where on first view a visually fancy site is great and appealing but in the long run a simple, fast-loading site is preferable.
Is it now? My parents, who were 100% tech illiterate, were able to put up a moderately complex website with FrontPage in 1996, without touching a single line of code and with a 100% custom look. Would they be able to do the same today with current tech stacks? Not so sure...
I think he was serious, and I agree with him — and I believe that most users don’t want the majority of what JavaScript-laden pages offer over clean HTML-only pages.
The crux of the problem is that the page in this case doesn't need to be sophisticated. It lists result links from top to bottom. JavaScript affordances that may enhance the interface, like autoloading the next page of results, can be added progressively (see the sketch below). In particular, "visually fancier" content doesn't necessarily come at the expense of accessibility in limited browsers. That's the point of CSS.
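A minimal sketch of what "progressively" could mean here (the URL and element names are hypothetical): the paging control is a plain link any browser can follow, and a script, where supported, autoloads the next page instead.

    <!-- Baseline: a plain paging link that works everywhere, Lynx included -->
    <a id="next-page" href="/search?q=lynx&page=2">Next</a>
    <script>
      // Enhancement: fetch the next page automatically when the link
      // scrolls into view, in browsers that support IntersectionObserver.
      const next = document.getElementById('next-page');
      if (next && 'IntersectionObserver' in window) {
        new IntersectionObserver(async (entries, observer) => {
          if (entries[0].isIntersecting) {
            observer.disconnect();
            const html = await (await fetch(next.href)).text();
            // ...parse `html` and append the next batch of results...
          }
        }).observe(next);
      }
    </script>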
That is really the fault of the "modern web": web pages are more "sophisticated" than they need to be to present the information they contain in a visually pleasing and usable manner. There are so many roundabout approaches to the problem that people don't concern themselves with the most straightforward one. I can't say it's somehow massively easier to create a simple list of links with excerpts in a way that doesn't work on a 15-year-old browser than in a way that does. You really have to go out of your way to break something as simple and fundamental as linking.
> Creating sophisticated web pages is massively easier than 10 or 20 years ago.
Well, if we're talking progress, we can also compare, say, the energy efficiency of information transmission, or the number of well-maintained Web clients. That doesn't look so good, does it? The question is what problem we are solving, or, in your words, what "sophistication" is. By some measures we did achieve impressive things. But a lot of us feel heartache, because we think we didn't do all that well.
To say that text browsers are "anchoring" the web to those text only standards would imply that developers are making design decisions based on testing and feedback from text only browsers.
There is no way the percentage of developers doing that isn't vanishingly small. Like 0.1% or less. I always chuckle when one person chimes in on a Show HN post to complain that the site doesn't work well in Lynx... Yeah, I'll get right on that, top priority!
You can make that choice. On the other hand, if you decide not to fix it, you'll alienate not only accessibility-focused users but also a very loud minority, some of whom are influential across tech (or tech-adjacent) communities.
Amen; fourth here... no need to burn JS cycles on an HTML query.
From another comment digging into the issue.[0]
> Regarding 'L', Lynx sometimes "hides" links if the anchor is around a div. Maybe it is just that simple. IIRC, <a href=...><div>...</div></a> will trigger a similar behaviour.
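For illustration, the pattern in question looks something like this (hypothetical markup; wrapping a block element in an anchor is valid HTML5, yet Lynx reportedly drops the link):

    <!-- Anchor wrapping a block element: Lynx may hide the link -->
    <a href="/result"><div>Result title and excerpt</div></a>

    <!-- Inline anchor inside the block: Lynx renders this fine -->
    <div><a href="/result">Result title and excerpt</a></div>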
I'm generally against unnecessary web complexity, but I don't understand how anyone can paint Lynx as a hero for randomly ignoring anchor tags.
I embrace progressive enhancement where possible; all of my blogs/sites will load and function without JavaScript. But I'm not going to serve alternative HTML in a scenario like this. There has to be some give and take, with Lynx supporting objectively valid, pure-HTML content.
It wouldn't violate any of Lynx's pure-text principles to parse modern HTML correctly.
[0]: https://news.ycombinator.com/item?id=21636159
> The modern web has very little to do with providing value to the end-user
I disagree strongly with this. The web has moved a lot in the direction of developer experience (ES6, modules) and new capabilities (WebSockets, WebRTC, WebAudio, SVG, canvas...). Yes, most of this happened as a side effect of big surveillance-capitalism companies wanting to make that sweet, sweet digital pollen even sweeter, but it isn't any less sweet for having been made in bad faith.
The capabilities are there. The dev experience is there (somewhat; JS ecosystem is a mess, but I guess that's just a side effect of moving very fast). But the capabilities are not used for end-user benefit. Not much, anyway. Yes, I benefit from Netflix, I benefit from Google products (not as much as I would if they didn't keep on worsening the UX every few months) - and such functionality requires some of the new capabilities. But 99% of other sites I visit don't use these capabilities for anything good. A pizza ordering site shouldn't need a React frontend. An e-commerce store shouldn't use WebSockets. Every other site out there should not ask me to enable notifications. But they all do. I blame a mix of CV-driven development, designers showing off, "data-driven" bandwagon, and surveillance capitalism.
7 replies →
> I am now prepared for 6 comments replying to me saying that anything that can be implemented with HTML from 1999 should be, and a list of search results can be.
They would be correct replies!
In addition, there's this concept called "graceful degradation": if the browser has more advanced features, you support them; otherwise, your page works anyway. It's not like supporting Lynx means you can't have a map in the search results when using Chrome. Certainly not for a company with the resources of Google.
Also, they should probably send something like a Lynx version of Google to people on a poor internet connection (a rough sketch of the idea below).
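Purely to illustrate the shape of it (the URLs and script name are made up): the results and the map are plain links that work in any browser, and a script upgrades the map link into an inline map only where scripting exists.

    <!-- Baseline: works in Lynx, screen readers, and over slow links -->
    <ol>
      <li><a href="/place/1">Pizza Palace, 123 Main St</a></li>
      <li><a href="/place/2">Slice City, 456 Oak Ave</a></li>
    </ol>
    <a id="map-link" href="/map?q=pizza">View results on a map</a>

    <!-- Enhancement: swap the link for an interactive map when JS runs -->
    <script src="/js/inline-map.js" defer></script>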
They're probably hiring only the best machine-learning, cloud-native engineers fresh out of university, who have never heard of Lynx, and now the institution doesn't even realize it broke support for it.
> We were all mad when IE was holding back the Internet
As I recall it, we weren't mad about IE "holding back the Internet", we were mad about IE encouraging web designers to stick a bunch of dynamic clutter such as ActiveX controls into their webpages. Largely because they created this lock-in where sites only worked well on one browser.
It turns out that JavaScript has been co-opted into being the new ActiveX, and Chrome is the new IE. But since JavaScript is nominally an open standard, and Chrome runs on the big 3 OSes, nobody seems to get mad that alternative browser projects are dying because they can't keep up with all the stuff that needs to be implemented in order to work well with sites that were only tested on Chrome and WebKit.
Amen, amen! The ignominious "best viewed in" which we fought in the Second Browser War is creeping back in, except now it says "this hour's current Google Chrome" instead of "MSIE 6".
TBH, I think it's worse than it was with IE6. Targeting just IE was annoying, but, even so, 20 years ago, I could use BeOS as my primary desktop OS, and Net+ was a decent enough browser. Nowadays, the bar to successfully render a modern JavaScript-heavy website is so high that even Microsoft couldn't successfully maintain their own independent rendering engine, and is shifting over to Google's.
Anyone who has heard of the phrase "embrace, extend, extinguish" should be at least a little bit uncomfortable with this situation. For example, it implies that any potential alternative OS needs to be able to compile Chromium (or, as what can only be seen as a second-class substitute nowadays, Gecko) before it can really be viable. If you're the kind of person who likes a free, open and competitive software landscape, that's a looming threat.
1 reply →
One does not need to time travel to 1999 to accommodate text browsers, current HTML works just fine.
OTOH I don't worry too much; accessibility-enforcing laws will provide plenty of job opportunities for future developers... So yeah, good move, I guess.
> I am now prepared for 6 comments
Not sure that's the case; if it were, I'd expect a comment that showed a little deeper understanding of the issues involved.
"HTML from 1999" isn't the issue. Lynx did fine on that... and HTML from 5 years ago, just like a whole host of non-visual or semi-visual user agents.
brow.sh is... OK, I guess; nice to have around in a pinch. But like most other schemes that rely on a headless full-fledged browser to work with applications that have become dependent on JS to merely render content, it introduces a glorified screen-scraping layer over something that could easily be much simpler, assuming web application developers could be bothered to think about it.
We don't need to freeze the web at 1999, and some applications fit poorly in a non-visual context. But a little reflection on the merits of progressive enhancement, and on moving forward without losing the benefits we had at that stage, would be nice. The era when people cared about such things was pretty amazing in terms of the breadth of devices web applications would, in fact, work on pretty well.
Also, if you're not thinking about a pretty plain HTML version of your app, chances are half decent that you're missing an opportunity to engineer your application better, whether or not you care about UA interop.
But, you know, if you're pretty sure the browser should be "thought" about as nothing more than The VM That Lived™, by all means, carry on.
Thanks for making me aware of brow.sh. What an amazing project.
I use text browsers because they are efficient and fast. Also, static content ideally shouldn't need JS, because it's a security concern.
AFAIK most of us don't complain when a dynamic website doesn't work. We just use a modern browser.
> I feel like many of the text-mode browsers have failed to keep up with changing web standards
Yeah, web standards such as images and video and audio. Not keeping up with those standards is kind of the point.
How about tables, or frames? Lynx doesn't even support those.
Links does, and Emacspeak with Eww is the best browser for the blind ever.
> brow.sh
Does headless Firefox (which is what brow.sh is at its core) even launch if there's no X11 available? Is it that headless?
I haven't used it, but it works in a Docker container without any device mounting or access to the host X server, so I'd assume that it does indeed work without any X available.
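If memory serves, the documented Docker quickstart is just the following, with no X socket or devices passed through:

    docker run --rm -it browsh/browsh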
I've played with it and it works over a headless server instance through an ssh session, so I think it does not require X11
It does work without an external X11, all right (trick question, actually: there's an Xvfb underneath all the turtles, so X11 actually is required, even though it's all hidden inside the container ;))
w3m gets a pass because JavaScript is bad. The FSF is right about that. Wasm will be even worse; not technically worse, but for removing another layer of control. DNS over HTTPS is just as bad in that sense.
If you can make something without JS, you should. cryptomarketplot.com has an accessible mode FOR CRYPTO!! If crypto sites can, everybody can.
It's funny because "javascript is bad" was common geek sentiment at the turn of the century. Now it is flamebait, or being caught up in the past [itself possibly a bit of a code for ageism]. Seeing this attitude change is one of the most interesting things I've seen in tech nerd circles in the last decade or so.
Can you elaborate on why it's funny? JavaScript (on its own) has changed a lot in 20 years, and it was originally designed in, what, one week, due to external time constraints? I think the criticisms are valid if $Your_favorite_language had only ~4 days for language design. From my recollection, there were no frameworks, no tooling, no linters, no jQuery. It was basically a primitive language for browsers that had no compiler and didn't do anything useful (back in the '90s). About as exciting as VBA 1.0.
5 replies →