Comment by Torq_boi
5 hours ago
Yes, it's easy to destroy the servers with a lot of dust and/or high humidity. But with filtering and ensuring humidity never exceeds 45% we've had pretty good results.
I remember visiting a small data center (about half the size of the Comma one) where shoe covers were required. Apparently they were worried about people’s shoes bringing in dust and other contamination.
It's not a static number, as it's also based on ambient air temperature in the form of dew point - 45% RH at low temperatures can be far more dangerous than 65% RH at a warm ambient.
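To make that concrete: dew point can be approximated from dry-bulb temperature and RH with the Magnus formula. A minimal sketch - the constants are the common Magnus ones and the two example ambients (5°C and 25°C) are just ones I picked for illustration, nothing from TC 9.9 itself:

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Approximate dew point in degrees C via the Magnus formula."""
    a, b = 17.62, 243.12  # common Magnus constants
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# Similar-sounding RH numbers, very different dew points:
print(round(dew_point_c(5.0, 45.0), 1))   # ~ -6.0  (45% RH on a cold day)
print(round(dew_point_c(25.0, 65.0), 1))  # ~ 18.0  (65% RH on a warm day)
```

The risk limits are written against dew point (and dry bulb), not RH, which is why the same RH figure can mean very different things.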
Likewise, the impact on server longevity is not a hard boundary but rather an "exposure over time" gradient: exceeding the "low risk" boundary (>-12°C/10°F dew point or >15°C/59°F dry bulb temp) results in a lower MTBF than designed. This is defined in ASHRAE TC 9.9, which server equipment manufacturers conform and build to. This means that if you're running your servers above the high-risk curve for humidity and temperature, you're shortening their life considerably compared to the low-risk curve.
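The "exposure over time" part is the important bit - it's accumulated hours outside the envelope, not any single reading, that eats into MTBF. A rough sketch of that bookkeeping, assuming hourly (dry bulb, dew point) readings and using the boundary numbers above; this is just hour-counting, not the actual TC 9.9 reliability model:

```python
# Tally hours spent above the "low risk" boundary quoted above
# (dew point > -12°C or dry bulb > 15°C). Illustrative only - the
# real TC 9.9 guidance maps accumulated exposure to relative
# failure rates, which this does not attempt to reproduce.

def hours_above_low_risk(samples):
    """samples: iterable of (dry_bulb_c, dew_point_c) hourly readings."""
    return sum(1 for db, dp in samples if dp > -12.0 or db > 15.0)

# Hypothetical year of hourly readings: mostly mild/humid, some cool/dry.
year = [(20.0, 8.0)] * 6000 + [(14.0, -15.0)] * 2760
print(hours_above_low_risk(year), "of", len(year), "hours above the low-risk boundary")
```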
Generally, 15% RH is considered suboptimal and can be dangerous near freezing temperatures. As a recent example: in San Diego in January there were several 90%+ RH scenarios that would have been dangerous for servers even when mixed down with warm exhaust air. Furthermore, the outdoor air at 76°F during that period means you have limited capacity to mix in warm exhaust air (which, by the way, came from that same 99% RH intake air) without getting into higher-than-ideal intake temps.
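For a sense of why the mixing headroom is so small there, a simple sketch - the hot-aisle return temperature and the mix fractions are made-up figures, and a proper psychrometric calc would mix by humidity ratio and enthalpy, whereas this only blends dry-bulb:

```python
# Rough mix calculation for that scenario. Assumes the recirculated
# exhaust is the same outdoor air plus server heat only (no moisture
# added), so mixing raises the dry-bulb temperature but leaves the
# moisture content, and therefore the dew point, where it was.

def mixed_dry_bulb_c(outdoor_c, exhaust_c, exhaust_fraction):
    """Mass-weighted blend of outdoor and recirculated air (sensible only)."""
    return (1.0 - exhaust_fraction) * outdoor_c + exhaust_fraction * exhaust_c

outdoor_c = 24.4   # ~76°F outdoor air
exhaust_c = 40.0   # hypothetical hot-aisle return temperature
for frac in (0.0, 0.2, 0.4):
    print(frac, round(mixed_dry_bulb_c(outdoor_c, exhaust_c, frac), 1))
# 0.0 -> 24.4, 0.2 -> 27.5, 0.4 -> 30.6: not much headroom below a
# ~32°C intake limit, and the high dew point hasn't improved at all.
```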
Any dew point above 62.5°F is considered high risk for servers, as is any intake temp exceeding 32°C/90°F. You want to be at the midpoint between those and 16°C/65°F temps & a -12°C/10°F dew point to have no impact on server longevity or MTBF rates.
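If you want to sanity-check a given intake condition against those numbers, it's just band checks - something like this, where the bands are my reading of the figures above rather than a transcription of the standard:

```python
def classify_intake(dry_bulb_c, dew_point_c):
    """Bucket one intake condition using the thresholds quoted above."""
    if dew_point_c > 16.9 or dry_bulb_c > 32.0:       # >62.5°F dew point or >32°C/90°F intake
        return "high risk"
    if dew_point_c <= -12.0 and dry_bulb_c <= 16.0:   # the "no impact" corner
        return "no impact on longevity"
    return "somewhere on the exposure gradient"

print(classify_intake(24.0, 18.0))   # warm and humid  -> high risk
print(classify_intake(14.0, -15.0))  # cool and dry    -> no impact on longevity
print(classify_intake(22.0, 5.0))    # in between      -> somewhere on the exposure gradient
```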
Next, air contaminants - in the form of dust (which can be filtered out) and chemicals (which can't be without extensive scrubbing) - are probably the most detrimental to server equipment if not properly managed, and require very intentional and frequent filter changes (typically high-MERV pleated filters changed on a time or pressure-drop signal) to prevent server degradation and equipment risk.
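The "time or pressure drop signal" usually boils down to watching differential pressure across the filter bank and swapping on whichever trigger fires first. A minimal sketch, with made-up thresholds:

```python
from datetime import date, timedelta

# Hypothetical change-out rule: replace the filters when the measured
# pressure drop across the bank exceeds a final-resistance threshold,
# or after a fixed service interval, whichever comes first. The 250 Pa
# and 90-day figures are placeholders, not vendor numbers.
MAX_PRESSURE_DROP_PA = 250.0
MAX_SERVICE_INTERVAL = timedelta(days=90)

def filter_change_due(pressure_drop_pa, last_changed, today):
    return (pressure_drop_pa >= MAX_PRESSURE_DROP_PA
            or today - last_changed >= MAX_SERVICE_INTERVAL)

print(filter_change_due(180.0, date(2024, 1, 2), date(2024, 2, 15)))  # False
print(filter_change_due(270.0, date(2024, 1, 2), date(2024, 2, 15)))  # True (pressure drop)
print(filter_change_due(180.0, date(2024, 1, 2), date(2024, 4, 15)))  # True (time)
```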
The last consideration is fire suppression - permitted datacenters usually have to comply with a separate fire code, such that direct outdoor air exchange without active shutdown and dry suppression is not permitted. This is to prevent a scenario where your equipment catches fire and a constant supply of fresh, oxygen-rich outdoor air turns that into an inferno. Smoke detection systems also don't operate well with outdoor-mixed air or any level of airborne particulates.
So, for those reasons - among a few others - open-air datacenters are not recommended unless you're doing them at Google or Meta scale, and in those scenarios you typically have much more extensive systems and purpose-designed hardware in order to operate for the design life of the equipment without issues.