Comment by Arnavion

3 days ago

>This dance to get access is just a minor annoyance for me, but I question how it proves I’m not a bot. These steps can be trivially and cheaply automated.

>I think the end result is just an internet resource I need is a little harder to access, and we have to waste a small amount of energy.

No need to mimic the actual challenge process. Just change your user agent to not have "Mozilla" in it; Anubis only serves you the challenge if it has that. For myself I just made a sideloaded browser extension to override the UA header for the handful of websites I visit that use Anubis, including those two kernel.org domains.

(Why do I do it? For most of them I don't enable JS or cookies, so the challenge wouldn't pass anyway. For the ones that I do enable JS or cookies for, various self-hosted GitLab instances, I don't consent to my electricity being used for this any more than if it was mining Monero or something.)

Sadly, touching the user-agent header more or less instantly makes you uniquely identifiable.

Browser fingerprinting works best against people with unique headers. There are probably millions of people using an untouched Safari on an iPhone. Once you touch your user-agent header, you're likely the only person in the world with that fingerprint.

  • If someone's out to uniquely identify your activity on the internet, your User-Agent string is going to be the least of your problems.

  • UA fingerprinting isn't a problem for me. As I said I only modify the UA for the handful of sites that use Anubis that I visit. I trust those sites enough that them fingerprinting me is unlikely, and won't be a problem even if they did.

  • I'll set mine to "null" if the rest of you will set yours...

    • The string “null” or actually null? I’ve recently seen a huge amount of bot traffic that has no UA at all, and I just outright block it. It’s almost entirely (Microsoft cloud) Azure script attacks.

  • If your headers are new every time then it is very difficult to figure out who is who.

  • Yes, but if you can detect that they aren't using a major fingerprinting provider, you can take the bet, and win more often than not, that your adversary most likely isn't doing that kind of probabilistic visitor tracking at all.

  • I wouldn’t think the intention is to s/Mozilla// but to select another well-known UA string.

    • The string I use in my extension is "anubis is crap". I took it from a different FF extension that had been posted in a /g/ thread about Anubis, which is where I got the idea in the first place. I don't use other people's extensions if I can help it (because of the obvious risk), but I figured I'd use the same string in my own extension so that I'd be lumped in with that extension's users in any user-agent statistics.

    • The UA will be compared against other data points such as screen resolution, fonts, plugins, etc., which means you are definitely more identifiable if you change just the UA than if you change your entire browser or operating system. (See the sketch below.)
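
      Roughly what that looks like in practice, as an illustrative sketch only (not any particular vendor's code); once one of these values is a one-off, the whole combination tends to be unique:

      ```js
      // Illustrative only: typical browser signals that get joined with the
      // User-Agent header to build a fingerprint.
      const signals = [
        navigator.userAgent,                       // the header at issue here
        `${screen.width}x${screen.height}x${screen.colorDepth}`,
        Intl.DateTimeFormat().resolvedOptions().timeZone,
        navigator.language,
        navigator.hardwareConcurrency,
        Array.from(navigator.plugins, (p) => p.name).join(","),
      ].join("|");

      // Collapse the combined string into a compact identifier
      // (crypto.subtle requires a secure context, i.e. an https page).
      crypto.subtle.digest("SHA-256", new TextEncoder().encode(signals)).then((buf) => {
        const hex = Array.from(new Uint8Array(buf), (b) => b.toString(16).padStart(2, "0")).join("");
        console.log(hex);
      });
      ```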

> (Why do I do it? For most of them I don't enable JS so the challenge wouldn't pass anyway. For the ones that I do enable JS for, various self-hosted gitlab instances, I don't consent to my electricity being used for this any more than if it was mining Monero or something.)

Hm. If your site is "sticky", can it mine Monero or something in the background?

We need a browser warning: "This site is using your computer heavily in a background task. Do you want to stop that?"

  • > We need a browser warning: "This site is using your computer heavily in a background task. Do you want to stop that?"

    Doesn't Safari sort of already do that? "This tab is using significant power", or summat? I know I've seen that message, I just don't have a good repro.

    • Edge does, as well. It drops a warning in the middle of the screen, displays the resource-hogging tab, and asks whether you want to force-close the tab or wait.

> Just change your user agent to not have "Mozilla" in it. Anubis only serves you the challenge if you have that.

Won't that break many other things? My understanding was that basically everyone's user-agent string nowadays is packed with a full suite of standard lies.

  • It doesn't break the two kernel.org domains that the article is about, nor any of the others I use. At least not in a way that I noticed.

  • In 2025 I think most of the web has moved on from checking user-agent strings. Your bank might still do it, but they won't be running Anubis.

    • Nope, they're on Cloudflare so that all my banking traffic can be intercepted by a foreign company I have no relation to. The web is really headed in a great direction :)

I'm interested in your extension. I'm wondering if I could do something similar to force the text encoding of pages to Japanese.

  • If your Firefox supports sideloading extensions, then making extensions that modify request or response headers is easy.

    All the API is documented in https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/Web... . My Anubis extension modifies request headers using `browser.webRequest.onBeforeSendHeaders.addListener()`. Your case sounds like modifying response headers instead, which is `browser.webRequest.onHeadersReceived.addListener()`. Either way, it's all documented there, as is the `manifest.json` you'll need to write to register this JS code as a background script and declare whatever permissions you need.

    Then zip the manifest and the script together, rename the zip file to "<id_in_manifest>.xpi", place it in the sideloaded-extensions directory (depends on distro, e.g. /usr/lib/firefox/browser/extensions), restart Firefox, and it should show up. If you need to debug it, you can use the about:debugging#/runtime/this-firefox page to launch a devtools window connected to the background script. A minimal sketch of both files follows.
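
    (This assumes a manifest v2 extension; the extension id, UA string, and host patterns below are placeholders, so substitute your own.)

    ```json
    {
      "manifest_version": 2,
      "name": "ua-override",
      "version": "1.0",
      "browser_specific_settings": { "gecko": { "id": "ua-override@example.invalid" } },
      "permissions": [
        "webRequest",
        "webRequestBlocking",
        "https://git.example.org/*"
      ],
      "background": { "scripts": ["background.js"] }
    }
    ```

    ```js
    // background.js: rewrite the User-Agent so it no longer contains "Mozilla".
    browser.webRequest.onBeforeSendHeaders.addListener(
      (details) => {
        for (const header of details.requestHeaders) {
          if (header.name.toLowerCase() === "user-agent") {
            header.value = "not-a-mozilla-browser"; // placeholder UA string
          }
        }
        return { requestHeaders: details.requestHeaders };
      },
      { urls: ["https://git.example.org/*"] }, // placeholder host pattern
      ["blocking", "requestHeaders"]
    );

    // The response-header variant for the encoding question has the same shape,
    // but uses onHeadersReceived and rewrites Content-Type. Forcing Shift_JIS here
    // only takes effect if the header already declares a charset.
    browser.webRequest.onHeadersReceived.addListener(
      (details) => {
        for (const header of details.responseHeaders) {
          if (header.name.toLowerCase() === "content-type") {
            header.value = header.value.replace(/charset=[^;]*/i, "charset=Shift_JIS");
          }
        }
        return { responseHeaders: details.responseHeaders };
      },
      { urls: ["https://git.example.org/*"] },
      ["blocking", "responseHeaders"]
    );
    ```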

[flagged]

  • >Not only is Anubis a poorly thought out solution from an AI sympathizer [...]

    But the project's own description says it exists to stop AI crawlers?

    > Weighs the soul of incoming HTTP requests to stop AI crawlers

    • Why would a company that wants to stop AI crawlers give talks on LLMs and diffusion models at AI conferences?

      Why would they use AI art for the first Anubis mascot until GitHub users called out the hypocrisy on the issue tracker?

      Why would they use Stable Diffusion art in their blogposts until Mastodon and Bluesky users called them out on it?
