Comment by ai-christianson

1 year ago

How many of you all are running bare metal hooked right up to the internet? Is DDoS or any of that actually a super common problem?

I know it happens, but also I've run plenty of servers hooked directly to the internet (with standard *nix security precautions and hosting provider DDoS protection) and haven't had it actually be an issue.

So why run absolutely everything through Cloudflare?

Yes, [D]DoS is a problem. It's not uncommon for a single person with residential fiber to have more bandwidth than your small site hosted on a 1U box or VPS. Either your bandwidth is rate limited and they can deny service to your site, or your bandwidth is greater but they can still push you over your allocation and run up massive charges.
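
The overage scenario can be put into numbers with a quick back-of-envelope calculation. All figures below are illustrative assumptions, not numbers from the comment:

```python
# Back-of-envelope for the overage scenario above. Assumed figures:
# a 1 Gbps residential attacker flooding a metered host for 24 hours
# at a typical ~$0.09/GB overage rate.
attack_gbps = 1.0
hours = 24
price_per_gb = 0.09

gb_transferred = attack_gbps / 8 * 3600 * hours   # gigabits/s -> GB over 24h
overage_cost = gb_transferred * price_per_gb
print(f"{gb_transferred:.0f} GB -> ${overage_cost:.0f} in overage charges")
```

A single day of one residential connection's worth of traffic is enough to turn a "cheap" metered plan into a four-figure bill.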

In the past you could ban IPs but that's not very useful anymore.

The distributed attacks tend to be AI companies that assume every site has infinite bandwidth and their crawlers tend to run out of different regions.

Even if you aren't dealing with attacks or outages, Cloudflare's caching features can save you a ton of money.

If you haven't used Cloudflare, most sites only need their free tier offering.

It's hard to say no to a free service that provides features you need.

Source: I went over a decade hosting a site without a CDN before it became too difficult to deal with. I basically spent 3 days straight banning IPs at the hosting-company level, tuning various rate-limiting web server modules, and even scaling the hardware to double the capacity. None of it could keep the site online 100% of the time. Within 30 minutes of trying Cloudflare it was working perfectly.

  • > It's hard to say no to a free service that provides features you need.

    Very true! Though you still see people who are surprised to learn that CF DDoS protection acts as a MITM proxy and can read your traffic in plaintext. This is of course by design, to inspect the traffic. But admittedly, CF is not very clear about this in the admin panel or docs.

    Places one might expect to learn this, but won't:

    - https://developers.cloudflare.com/dns/manage-dns-records/ref...

    - https://developers.cloudflare.com/fundamentals/concepts/how-...

    - https://imgur.com/a/zGegZ00

  • > not uncommon for a single person with residential fiber to have more bandwidth than your small site hosted on a 1u box or VPS.

    Then self host from your connection at home, don't pay for the VPS :). That's what I've been doing for over a decade now and still never saw a (D)DoS attack

    50 Mbps has been enough to host various websites, including one that has allowed unauthenticated uploads of several gigabytes for most of the time I've been self-hosting. I must say 100 Mbps is nicer, though, even if not strictly necessary. Well, more is always nicer, but the returns really diminish after 100 (in 2025, for my use case). It's probably different if you host videos, a Tor relay, etc.; I'm just talking about normal websites.

    • > 50 Mbps has been enough to host various websites,

      Bandwidth hasn't been a limiting factor for years for me.

      But generating dynamic pages can create just enough load for things to get painful. Just this week I had to blacklist Meta's ridiculously overactive bot, which was sending me more requests per second than all my real users do in an hour. Meta and ClaudeBot have been causing intermittent overloads for weeks now.

      They now get 403s because I'm done trying to slow them down.
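
The 403-by-User-Agent approach described above can be sketched like this. ClaudeBot is named in the comment; the other agent string and the substring matching are illustrative assumptions, not a complete bot list:

```python
# Refuse known overactive crawlers by User-Agent and answer 403.
# "ClaudeBot" comes from the comment above; "meta-externalagent" is an
# assumed string for Meta's crawler, and this list is not exhaustive.
BLOCKED_AGENTS = ("meta-externalagent", "claudebot")

def allow_request(user_agent: str) -> bool:
    """Return False for user agents that should receive a 403."""
    ua = user_agent.lower()
    return not any(bot in ua for bot in BLOCKED_AGENTS)

print(allow_request("Mozilla/5.0 (compatible; ClaudeBot/1.0)"))  # False
print(allow_request("Mozilla/5.0 (X11; Linux x86_64)"))          # True
```

As other commenters note, this only helps against bots that send an honest User-Agent; scrapers that lie about who they are need heavier tools.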

I've been hosting web sites on my own bare metal in colo for more than 25 years. In all that time I've dealt with one DDoS that was big enough to bring everything down, and that was because of a specific person being pissed at another specific person. The attacker did jail time for DDoS activities.

Every other attempt at DDoS has been ineffective, has been form abuse and credential stuffing, has been generally amateurish enough to not take anything down.

I host (web, email, shells) lots of people including kids (young adults) who're learning about the Internet, about security, et cetera, who do dumb things like talk shit on irc. You'd think I'd've had more DDoS attacks than that rather famous one.

So when people assert with confidence that the Internet would fall over if companies like Cloudflare weren't there to "protect" them, I have to wonder how Cloudflare marketed so well that these people believe this BS with no experience. Sure, it could be something else, like someone running Wordpress with a default admin URL left open who makes a huge deal about how they're getting "hacked", but that wouldn't explain all the Cloudflare apologists.

Cloudflare wants to be a monopoly. They've shown they have no care in the world for marginalized people, whether they're people who don't live in a western country or people who simply prefer to not run mainstream OSes and browsers. They protect scammers because they make money from scammers. So why would people want to use them? That's a very good question.

  • Same basic experience. The colo ISP soaks up most actual DDoS. We had a couple mid-sized ones when we were hosting irc.binrev.net from salty b& users. No real effect other than the colo did let us know it was happening and that it was "not a significant amount of DDoS by our standards."

  • I'm sorry but lumping in people who prefer to use a weird browser with "marginalised people" does not help your credibility.

    • What bit do you mean specifically? As a fellow web hoster, who also hosted kids before (from a game making forum), I can fully corroborate what they're saying

      1 reply →

    • You're focusing on the wrong kind of pedantry.

      "Marginalized" has a specific connotation, sure, but people can be marginalized for reasons other than, or in addition to, those that fit the connotation.

> How many of you all are running bare metal hooked right up to the internet?

I do. Many people I know do. In my risk model, DDoS is something purely theoretical. Yes it can happen, but you have to seriously upset someone for it to maybe happen.

  • From my experience, if you tick off the wrong person, the threshold for them starting a DDoS is surprisingly low.

    A while ago, my company was hiring and conducting interviews, and after one candidate was rejected, one of our sites got hit by a DDoS. I wasn't in the room when people were dealing with it, but in the post-incident review, they said "we're 99% sure we know exactly who this came from".

    • What the hell is wrong with people? Honestly the lack of substantive human interaction in a lot of folks' lives, except via the Internet, is a real problem.

      Take that story for instance. Here's how that goes in the physical world, just to show how unbelievably ridiculous it is.

      So you didn't get the job? What's your next step?

      I'll stop by their office and keep people from entering the front doors by running around in front of them. That'll show those bastards.

      1 reply →

I run a Mediawiki instance for an online community on a fairly cheap box (not a ton of traffic) but had a few instances of AI bots like Amazon's crawling a lot of expensive API pages thousands of times an hour (despite robots.txt preventing those). Turned on Cloudflare's bot blocking and 50% of total traffic instantly went away. Even now, blocked bot requests make up 25% of total requests to the site. Without blocking I would have needed to upgrade quite a bit or play a tiring game of whack a mole blocking any new IP ranges for the dozens of bots.

  • AI bots are a huge issue for a lot of sites. Intentional DDoS attacks aside, AI scrapers can frequently tip over a site because many of them don't know how to back off. Google is an exception, really; their experience building Googlebot has ensured that they are never a problem.

    Many of the AI scrapers don't identify themselves. They live on AWS, Azure, Alibaba Cloud, and Tencent Cloud, so you can't really block them, and rate limiting also has limited effect as they just jump to new IPs. As a site owner, you can't exactly contact AWS and ask them to terminate a customer's service so your site can recover.

  • How do you feel, knowing that some portion of the 25% “detected bot traffic” are actually people in this comment thread?

I've been running jakstys.lt (and subdomains like git.jakstys.lt) from my closet, a simple residential connection with a small monthly price for a static IP.

The only time I had a problem was when Gitea started caching git bundles of my Linux kernel mirror, which bots kept downloading (things like a full tar.gz of every commit since 2005). The server promptly ran out of disk space. I fixed the Gitea settings to not cache those, and that was it.

Never a DDoS. Or at least I (and UptimeRobot) never noticed one. :)

Small/medium SaaS. We had ~8 hours of 100k reqs/sec last year when we usually see 100-150 reqs/sec. We moved everything behind a Cloudflare Enterprise setup and ditched AWS Client Access VPN (OpenVPN) for Cloudflare WARP.

I've only been here 1.5 years, but it sounds like we usually see one decent-sized DDoS a year, plus a handful of other "DoS" incidents, usually AI crawler extensions or third parties calling too aggressively.

There are some extensions/products that create a "personal AI knowledge base"; they'll use the customer's login credentials and scrape every link once an hour. Some links are really resource-intensive data or report requests that are very rare in real usage.

  • Did you put rate limiting rules on your webserver?

    Why was that not enough to mitigate the DDoS?

    • Not the same poster, but the first "D" in "DDoS" is why rate limiting doesn't work: attackers these days usually have a _huge_ pool (tens of thousands) of residential IPv4 addresses to work with.

      4 replies →

    • We had rate limiting with Istio/Envoy but Envoy was using 4-8x normal memory processing that much traffic and crashing.

      The attacker was using residential proxies and making about 8 requests before cycling to a new IP.

      Challenges work much better: they use cookies or other metadata to establish that a client is trusted, then let its requests pass. This stops bad clients at the first request, but you need something more sophisticated than a webserver with basic rate limiting.

      2 replies →

    • That might have been good for preventing someone from spamming your HotScripts guestbook in 2005, but not much else.
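
The thread above can be condensed into a toy simulation: a per-IP limit waves through a distributed attack of ~8 requests per IP, while a signed challenge token stops unknown clients at their first request. The limit, secret, and addresses are all made-up illustrative values:

```python
import hashlib
import hmac
from collections import defaultdict

PER_IP_LIMIT = 10              # assumed per-IP requests allowed per window
SECRET = b"rotate-me-regularly"  # stand-in signing key, not a real one

def passes_rate_limit(requests):
    """Count requests per IP; allow only those under PER_IP_LIMIT."""
    counts = defaultdict(int)
    allowed = 0
    for ip in requests:
        counts[ip] += 1
        if counts[ip] <= PER_IP_LIMIT:
            allowed += 1
    return allowed

def challenge_token(client_id: str) -> str:
    """Signed token a client earns only by completing the challenge."""
    return hmac.new(SECRET, client_id.encode(), hashlib.sha256).hexdigest()

def passes_challenge(client_id, token):
    """First-time clients (no valid token) are challenged, not served."""
    return token is not None and hmac.compare_digest(token, challenge_token(client_id))

# 10,000 residential IPs sending 8 requests each, as described above:
attack = [f"10.{i // 250}.{i % 250}.1" for i in range(10_000) for _ in range(8)]
print(passes_rate_limit(attack))           # 80000 -- every request gets through
print(passes_challenge("10.0.0.1", None))  # False -- challenged on first contact
```

The challenge here is just an HMAC-signed token; real services layer many more signals on top, but the shape is the same: move the cost to the first request instead of counting per IP.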

It’s free unless you’re rolling in traffic, it’s extremely easy to set up, and CF can handle pretty much all of your infra with tools way better than AWS.

Also, you can buy a cheaper IPv6-only VPS and run it through the free CF proxy to allow IPv4 traffic to reach your site.

I also rely on hosting provider DDoS protection and don't use very intrusive protection like Cloudflare.

The only issues I've had to deal with are when someone finds some slow endpoint and manages to overload the server with it. My go-to approach is to optimize it down to a <10-20 ms response time, while blocking the source of the traffic if it keeps being too annoying after optimization.

And this happened like 2-3 times over 20 years of hosting the eshop.

Much better than exposing users to CF or likes of it.

Most (D)DoS attacks are just UDP floods or SYN floods that iptables will handle without any problem. Sometimes what people think is a DDoS is just their application DDoSing itself with recursive calls to some back-end microservice.

If it was actually a traffic-based DDoS, someone still needs to pay for that bandwidth, which would be too expensive for most companies anyway, even if it kept your site running.

But you can sell a lot of services to incompetent people.

  • What's the iptables invocation that will let my 10 Gbps connection drop a 100 Gbps SYN flood while also serving good traffic?

    • The point of a SYN flood is to exhaust the OS's limit on half-open connections. From an attacker's perspective, the whole point of a SYN flood is to achieve a DoS without needing much bandwidth.

      My experience from 15 years working in the hosting industry is that volumetric attacks are extremely rare; customers who turn to Cloudflare as a solution are more often than not DDoSing themselves because of badly configured systems, and their junior developers lack the networking troubleshooting skills to diagnose it.

  • You need an answer to someone buying $10 of booter time and sending a volumetric attack your way. If any of the traffic is even reaching your server, you've already lost, so iptables isn't going to help you because your link is saturated.

    Cloudflare offers protection for free.

I run my "server" [1] straight to my home internet, and maybe I should count my blessings but I haven't had any issues with DDoS in the years I've done this.

I have relatively fast internet, so maybe it's fast enough to absorb a lot of the problems, but I've had good enough luck with some basic Nginx settings and fail2ban.

[1] a small little mini gaming PC running NixOS.
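
For anyone curious what "basic fail2ban" behavior amounts to, here is a toy sketch of the counting idea, not fail2ban's actual implementation. The log format and threshold are invented for illustration:

```python
from collections import Counter

# Toy fail2ban-style pass: count failed (4xx) responses per IP in a
# log window and ban repeat offenders. The "<ip> <status>" log format
# and BAN_THRESHOLD are made-up examples.
BAN_THRESHOLD = 3

def ips_to_ban(log_lines):
    """Each line: '<ip> <status>'. Return IPs with too many 4xx hits."""
    failures = Counter()
    for line in log_lines:
        ip, status = line.split()
        if status.startswith("4"):
            failures[ip] += 1
    return {ip for ip, n in failures.items() if n >= BAN_THRESHOLD}

log = ["198.51.100.9 404", "198.51.100.9 403", "198.51.100.9 404",
       "203.0.113.5 200", "203.0.113.5 404"]
print(ips_to_ban(log))  # {'198.51.100.9'}
```

Real fail2ban applies regex filters to log files and manages firewall rules for the bans; this just shows why a simple threshold catches the amateurish scanners that make up most hostile traffic.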

I would feel pretty safe running my own hand-written services against the raw Internet, but if I were to host Wordpress or other large/complicated/legacy codebases I'd start to get worried. Also, the CDN aspect is useful: having lived in Australia, you appreciate connections that don't have to traverse continents for every request.

It is common once your website hits a certain threshold in popularity.

If you are just a small startup or a blog, you'll probably never see an attack.

Even if you don't host anything offensive you can be targeted by competitors, blackmailed for money, or just randomly selected by a hacker to test the power of their botnet.

Other comments say that DDoS attacks are common; that's not my experience. I run a couple of API/SaaS sites and DDoSes are rare. The sites are in Canada and Brazil, if that matters, though I won't disclose which data centers. The strangest thing is that no one ever demanded a ransom during those DDoS attacks. Just some flooding for 1-2 days. Most of the time I didn't even care: the servers are on 10G ports and I pay 95th percentile for traffic with a cap on the final bill. The sites are geo-fenced by nftables rules; only countries of interest are allowed.

They make it easy to delegate a DNS zone to them and use their API to create records (e.g. install external-dns on Kubernetes and let it create records automatically for ingresses).

The biggest problems I see with DDoS are metered traffic and availability. The largest cloud providers all meter their traffic.

The availability part on the other hand is maybe something that's not so business critical for many but for targeted long-term attacks it probably is.

So I think for some websites, especially smaller ones it's totally feasible to not use Cloudflare but involves planning the hosting really carefully.

DDoS is a problem, but for most ordinary sites it's not as bad as people make it out to be. Even something very simple like fail2ban will go a long way.

Web scraping without any sleep between requests (usually firing many threads at once), as well as heavy exploit scanning, is a near constant for most websites. With AI technology it's only getting worse, as vendors pull in content from all over the web without regard for resource usage. Depending on the industry, DDoS can also be very common from competitors that aren't afraid to rent out botnets to boost their own business and tear down those they compete against.