Google.com search now refusing to search for FF esr 128 without JavaScript

3 days ago

It just redirects everything to https://www.google.com/httpservice/retry/enablejs?sei=... I guess this is the inevitable end of the era of the web as a collection of hyperlinked documents and the beginning of the web as an application delivery protocol.
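For anyone curious how the redirect works with JS off: reportedly it is a plain `<meta http-equiv="refresh">` tag (possibly inside `<noscript>`). A minimal Python sketch that detects that pattern in fetched HTML; the sample document below is made up to mimic the observed block page:

```python
from html.parser import HTMLParser

class MetaRefreshFinder(HTMLParser):
    """Find a <meta http-equiv="refresh"> redirect target in an HTML page."""
    def __init__(self):
        super().__init__()
        self.redirect = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("http-equiv", "").lower() == "refresh":
            # content typically looks like "0; url=https://example.com/..."
            content = a.get("content", "")
            i = content.lower().find("url=")
            if i != -1:
                self.redirect = content[i + 4:].strip()

# Made-up sample mimicking the observed block page
sample = """<html><head><noscript>
<meta http-equiv="refresh"
      content="0; url=https://www.google.com/httpservice/retry/enablejs">
</noscript></head><body>Please turn on JavaScript.</body></html>"""

finder = MetaRefreshFinder()
finder.feed(sample)
print(finder.redirect)
# https://www.google.com/httpservice/retry/enablejs
```

Note that `html.parser` only treats `<script>`/`<style>` as raw text, so tags inside `<noscript>` are parsed normally.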

In other browsers with JS disabled google search still works but this computational paywall rollout for Firefox esr is a sign of things to come.

Google is now forcing me to go elsewhere, and I pay for GSuite. I may as well move it all over to Proton.

I use NoScript and refuse to turn on JavaScript for anything but actual web applications. I do not turn it on for general browsing of content to be read or searched, because of the UX abuse that JavaScript enables.

The majority of exploits in the wild are delivered via drive-by JavaScript.

That said, I'm all for honest advertising if the UX is not shit.

In fact, I think there should be an HTML5 <ad></ad> tag, implemented in the browser's sandbox, that supports the IAB VAST specs, so that none of that "VAST macro" garbage would need to be done via huge JavaScript payloads.
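To be clear, no such element exists in any HTML spec today; this is purely a sketch of what a declarative, sandboxed ad tag might look like (the element name, attributes, and URLs are all made up):

```html
<!-- Hypothetical: not a real HTML element in any current spec -->
<ad src="https://adserver.example.com/campaign123/vast.xml"
    format="vast4"
    width="640" height="360">
  <!-- Fallback for browsers without <ad> support -->
  <a href="https://advertiser.example.com">Sponsored: Example Advertiser</a>
</ad>
```

The idea would be that the browser itself fetches and renders the VAST creative inside its sandbox, so no third-party script ever executes in the page.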

I think that lamenting the end of an era because Google doesn’t offer hyperlinked docs is like lamenting the end of fine dining because Olive Garden doesn’t offer cloth serviettes.

We’re looking in the wrong place if we want an ad company to be the champion of anything but revenue optimization.

Highly recommend Kagi Search as an alternative. The results are generally better than Google's anyways, and don't require JavaScript. It is a paid service, but at this point having reliable/privacy-respecting search is worth it.

Not affiliated with Kagi btw.

  • I see it compares ads/sponsored results against Kagi showing just webpages. If I use uBlock and block elements, does Kagi still have an advantage over that? In other words, are the search results themselves superior, setting aside the QOL improvements Kagi could bring?

    • I personally find the search results from Kagi to be superior to Google/DDG/etc, beyond just not having ads or sponsored content. Before switching to Kagi, I had started to feel like a lot of the front page results from Google were just sites that had managed to maximize their SEO but not actually have much valuable content. That hasn’t seemed to be the case with Kagi. I generally find that the results from them are a lot more informative.

      Of course that’s highly subjective, so I think it’s worth trying out their free tier to see if the potential improvement in search result quality is something you notice and find valuable enough to spend money on.

Generally works for me with https://www.google.com/search?gbv=1&q=test with JS blocked at domain-level by uBlock Origin.
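For scripting, that "basic HTML" URL can be built like this (a minimal sketch; whether Google actually serves the no-JS page still depends on their server-side checks):

```python
from urllib.parse import urlencode

def nojs_search_url(query: str) -> str:
    """Build Google's 'basic HTML' search URL; gbv=1 requests the no-JS UI."""
    return "https://www.google.com/search?" + urlencode({"gbv": "1", "q": query})

print(nojs_search_url("test"))
# https://www.google.com/search?gbv=1&q=test
```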

With the caveat that this used to work 100% of the time, but for a couple of months now it has indeed occasionally redirected to the “Turn on JavaScript to keep searching” page you mention, https://www.google.com/httpservice/retry/enablejs . I'd say the refusal happens in about 1 out of 20 searches. Said differently, I’d prefix your “refusing” with a “sometimes”.

I haven’t investigated the reason for this sometimes-ness. Would love to find an answer here, or ideas/leads (aside from switching to another search engine; yes, I do know about them, but sometimes Google remains better). Or maybe the sometimes-ness was just A/B testing, the full switch is happening, and this is now a thing of the past.

EDIT: you must have posted precisely at the moment the A/B test ended: I did several non-JS searches today at $job, and to confirm what I was writing here I did one more test, successfully. But 30 min later, I can confirm your observation: 100% blocked.

I think google did this to force AI on us. Does not matter to me, I left google a year or 2 ago.

Makes me wonder about Google and AI. With my tin-foil hat on, I cannot help but think Google/AI searches your cache and cookies looking for info.

  • Google has questionable behavior in its browser[0] and tracking technologies[1] that sound similar to what you describe, but I believe the search itself is behaving normally. It runs slowly because all LLM chatbots use tons of processing power to pore through servers full of data that may or may not be accurate.

    I agree Google is trying to force AI on us, but for a different reason: to demonstrate its value to shareholders.

    [0]: https://www.eff.org/deeplinks/2023/09/how-turn-googles-priva...

    [1]: https://www.schneier.com/blog/archives/2025/01/google-is-all...

  • The way web applications work, there is domain separation of data (be it cache or cookies), so Google's "AI" isn't going to be able to read data that it didn't already have access to before.

  • > Makes me wonder about Google and AI, with my tin-foil hat on, I cannot help but think Google/AI searches your cache and cookies looking for info.

    This is nonsense. Any cached data or cookies that Google’s scripts have access to was saved by those same scripts. If any site’s “AI” (not sure what you mean by that) could search through objects cached by other sites, you’d have bigger problems.

What's FF esr 128?

EDIT: Figured it out - https://www.reddit.com/r/firefox/comments/1ca4ii3/what_is_fi... - it's "Firefox Extended Support Release" - https://support.mozilla.org/en-US/kb/firefox-esr-release-cyc...

   It occurs to me this might be of interest to readers in this
   thread.

   A couple years ago I put together a list of sites that render
   well using Lynx and EWW.  Since neither browser supports
   JavaScript out of the box, maybe this is interesting to people
   here?

   https://ohmeadhbh.github.io/bobcat/

   I noticed the Greycoder site has several sites I should probably
   add.  But if you have a link you think should be included,
   please submit a PR on GitHub.  While I'm not horribly sensitive
   to GitHub's weirdness after being purchased by Microsoft, I am
   sensitive to people who are sensitive to it.  So if you don't
   want to go onto GitHub to submit a PR, you can find my email
   address at https://github.com/ohmeadhbh, just send me an email.

I guess Google is fed up with freeloaders piggybacking. Requiring JS is going to break a bunch of LLM crawlers immediately.

  • It'll also break for a lot of people with impaired vision who use screen readers. Screen readers can't keep up with the insane development pace of JS and CSS, and so people with impaired vision are going to be left behind. It's an accessibility nightmare.

    • Google and most search engines work fine with screen readers with JavaScript enabled. I think your understanding of how web accessibility works is likely severely outdated. There are just too many websites that use JavaScript; it would be a disservice if the web didn't support an accessible interface for pages with JavaScript.

      https://en.m.wikipedia.org/wiki/WAI-ARIA

      That said, as ARIA rule #1 says, it's better to use native HTML (and skip the JavaScript) where you can, as that's always less error-prone. That doesn't mean websites shouldn't use JavaScript when they have reasons to do so, as long as they correctly follow ARIA.
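      A concrete illustration of the "prefer native" principle (both snippets are generic examples, not from any particular site):

      ```html
      <!-- Preferred: a native element, accessible to screen readers by default -->
      <button type="button" onclick="save()">Save</button>

      <!-- Only if a native button is impossible: you have to recreate the
           semantics (role, focusability, keyboard handling) by hand -->
      <div role="button" tabindex="0" onclick="save()"
           onkeydown="if (event.key === 'Enter' || event.key === ' ') save()">
        Save
      </div>
      ```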


    • This is a common myth.

      Screen readers are not a type of web browser. They are software which interacts with other software running on the computer, including web browsers. There is nothing which inherently makes JS or CSS incompatible with screen readers.


    • My understanding is that people with impaired vision use the regular browser and a layer on top of it, such as VoiceOver. They don't need a special version of website. And screen readers don't need to keep up with JS.

Somewhat apropos... this isn't the first time people have been curious about which websites work with JS turned off:

From five years ago: https://dev.to/ziizium/famous-websites-with-javascript-disab...

And eight years ago: https://www.jakobstoeck.de/2017/websites-which-work-great-wi...

It does seem like JavaScript is required on more sites as time moves forward.

  • "It does seem like Javascript is required on more sites as time moves forward."

    If the statement was "Javascript is used on more sites" then I would agree and it is easy to test for use of Javascript.

    But a statement like "Javascript is required on more sites" is difficult to agree with, as I have had a very different experience.

    For example, I am now retrieving Google results from the command line without Javascript using a specific UA string. Arguably, that means no Javascript is "required" to retrieve search results. A specific UA string is now required though. Use the wrong UA string and then Javascript is "required" to retrieve the results.

    Rather than focusing on Javascript, a more interesting question might be whether more sites are requiring specific UA strings.

    By default I do not use Javascript (I do not use a graphical web browser) nor do I send a User-Agent header. The overwhelming majority of websites "work" for me with no problems. To me, it does not seem that more sites are requiring specific UA strings as time moves forward.

    Google www search is just one website. The www is vast.

I got it to work again with a user agent from Links: `Links (2.29; Linux 6.11.0-13-generic x86_64; GNU C 13.2; text)`.
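The same trick scripted in Python; a minimal sketch that only builds the request with that UA (it doesn't send anything, and the UA string is simply the one that happened to work for me):

```python
import urllib.parse
import urllib.request

# The Links text-browser UA string from above
LINKS_UA = "Links (2.29; Linux 6.11.0-13-generic x86_64; GNU C 13.2; text)"

def build_search_request(query: str) -> urllib.request.Request:
    """Build (but don't send) a Google search request with the Links UA."""
    url = "https://www.google.com/search?q=" + urllib.parse.quote(query)
    return urllib.request.Request(url, headers={"User-Agent": LINKS_UA})

req = build_search_request("test")
print(req.full_url)                  # https://www.google.com/search?q=test
print(req.get_header("User-agent"))  # urllib stores header keys capitalized
```

Pass the request to `urllib.request.urlopen(req)` to actually fetch the results page.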

Is anyone collecting links to interesting sites that work without javascript? This could slowly be turned into a simple search engine that returns only results compatible with Dillo and other small browsers.

Are there any alternatives? On the browser that I use, with javascript disabled,

Ask.com: Does not work; does nothing at all.

Ecosia: Blocked indirectly by Cloudflare.

Startpage: Blocks me explicitly, saying "Your connection has been suspended".

  • FWIW... I was able to get startpage.com to work with javascript turned off. But it doesn't really look that nice in a text browser.

    Maybe they're doing some weird geofencing or dislike your ISP?

Pretty proud of the fact that Kagi not only works without JavaScript, but looks and behaves almost the same. JavaScript is used to enhance the UX, not create it. It is like 'lite' mode is always on.

Same on Dillo; reloading makes the JS wall go away for now, but that probably won't last.

Edit: Reloading doesn't work anymore for me. Unless the sca_esv=xxxxxxxxx param is present it will redirect to the JS wall.

SERP-tracking tools are no longer working because of this. Lynx seems to be a workaround here.

Maybe try changing the user agent? I can use Google on my Kindle's web browser, and that can barely handle JavaScript (though the Kindle does do limited execution).

  • I have tried spoofing the user agent. No effect. It seems that if the browser is new enough and JS is turned off, it blocks you. But if you use a really old browser (~2015 Firefox) that doesn't support modern features, it still allows non-JS search. I think they must have the server looking at HTTP headers or fingerprinting or something. I don't think they could do the redirect based on CSS or HTML5 support without JS being run.

    • I believe they're doing a meta-tag redirect (possibly inside a noscript tag?) in at least some cases. Source: I'm developing a web engine that doesn't have JS support.


  • In other browsers, you just get an ad to use one of five other browsers, and, incidentally, to turn on Javascript. Always use NoScript to reduce the attack surface.

    For the dyed-in-the-wool: lynx https://www.google.com, tab and type in test, tab and enter. Now how can I get Lynx to remove the ad?

    A Startpage search on "Google requires Javascript" replies "Allow JavaScript in your browser - Google AdSense Help" - now isn't that special?

It is not just Firefox ESR 128. I can reproduce this with other UA strings. It has been intermittent for me so far. Sometimes I get /httpservice/retry/enablejs and sometimes I get results as usual. I do all searching from the command line. No Javascript.

Some HN commenters suggest DDG or paid search. Funnily enough, DDG is returning a CAPTCHA at the moment.

Fortunately I have many other free www search engine options that are working fine.

In addition there are countless website search engines that all continue to work. No Javascript required.

Disappointing. DuckDuckGo seems to still work w/o JS enabled. But Bing also fails to do anything w/o JS. HN comments still seem to work, thankfully.

  • DuckDuckGo specifically offers JS-free frontends, and Bing still works -- it's just their tracking redirect pages that require JS. Thankfully userscripts exist to deobfuscate the tracking URLs to plain ones on search results (they're slightly more complex than Google's).

    • How do you get Bing to work? I go there, type in a query, hit return, and then nothing. The magnifying-glass icon seems like it could be a search button, but nothing happens when it's clicked. I should mention I'm trying it on Firefox 128.6.0esr.

Stop using Google. Google has literally nothing but SEO spam and malware. Use DuckDuckGo or Kagi.