
Comment by DoctorOW

1 year ago

> Caddy is great but it doesn't solve the problems of this post. Namely, it is slightly less performant in exchange for ease of use, a tradeoff the author of the original article doesn't seem interested in.

That's what people love to say, but I'm not so confident it's true.

Yes, there's a slight Go tax in latency, but almost every comparison online benchmarks a well-optimized, often cache-enabled nginx or Apache config against the most basic Caddy config possible. Even worse, most only test HTTP/1 speeds with near-zero-size files. Who cares how many theoretical connections a server supports? Let's talk about how many users it supports on real-world content without grinding to a halt. With a few more lines of config, a production-intended Caddy setup trades punches evenly.
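To make that concrete, here is a sketch of what "a few more lines of config" might look like: a production-leaning Caddyfile with compression and cache headers enabled, rather than the bare default most benchmarks use. The domain, web root, and `/assets/*` path are placeholders, not anything from the original post.

```caddyfile
# Hypothetical production-oriented Caddyfile.
# example.com and /srv/site are placeholders for your own domain and web root.
example.com {
	root * /srv/site

	# Compress responses on the fly (roughly what gzip tuning does in nginx).
	encode zstd gzip

	# Serve static files, preferring pre-compressed .zst/.gz copies if present,
	# so no CPU is spent compressing at request time.
	file_server {
		precompressed zstd gzip
	}

	# Long-lived caching for fingerprinted static assets.
	header /assets/* Cache-Control "public, max-age=31536000, immutable"
}
```

With compression and cache headers in place, a benchmark against a tuned nginx config is at least comparing like with like.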

At least in my real-world testing I found little meaningful improvement with nginx. Worse, it would grind to a halt under loads where Caddy, while bogged down, would still stay responsive.

  • Maybe my traffic is too small, but I've never found Nginx/Caddy to be the bottleneck when scaling, so I have to rely on these benchmarks.

    In my personal opinion, even if there is runtime overhead with Caddy it's worth it. Hardware is cheap relative to what my time is worth to me.