Originally Posted by gfretwell
Back in the olden days when there really were data centers they had redundant HVAC systems so you lose a whole system and still have enough HVAC to keep the DP system going. Some even had huge storage tanks on the water chillers so you could lose all the A/C and still have time for an orderly shutdown/transfer to the backup site.
I am not sure we even have that kind of data center anymore. When I was leaving they were turning those "glass house" operations into a small room with a few racks that replaced a half acre of raised floor (most of the reason I left). Some could live in the "office" ambient air.
When the bad card has a red blinking light on it and you can hot swap in a new one without taking down the system you don't need me.
Yes, data centers still do have this, and in fact it's gotten even more obscene at some of those commercial data centers where every outage means millions of dollars lost: 2(N+1) redundant AC, with wells and/or water tanks for the chillers, and emergency ventilation as backup. The emergency ventilation uses little power and is sometimes connected to the UPS as well; if there's a multiple-generator failure, a datacenter running for 2+ hours on UPS power without cooling can get miiighty hot. (Sure, each UPS is only designed for 15 minutes, but with 2(N+1) units for redundancy upon redundancy (you wouldn't feed those redundant power supplies from the same N+1 UPS bus, would you?), each with a 15+ minute battery powering a load that probably isn't anywhere near total UPS capacity, the batteries can run far longer than their rating.)
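To put rough numbers on that last point, here's a back-of-the-envelope sketch. All figures are hypothetical, and it uses a naive linear battery model (real batteries discharge non-linearly, so treat the result as optimistic):

```python
# Back-of-the-envelope UPS runtime estimate. All numbers are made up
# for illustration; real battery runtime is non-linear (Peukert effect),
# so this linear model overestimates at light loads.

def runtime_minutes(rated_minutes, unit_capacity_kw, units, load_kw):
    """Naive model: runtime scales inversely with the load fraction."""
    total_capacity_kw = unit_capacity_kw * units
    return rated_minutes * total_capacity_kw / load_kw

# Say the design load needs N = 2 units of 200 kW each, built out as
# 2(N+1) = 6 units, each rated for 15 minutes at full load.
# Actual load is only 300 kW:
print(runtime_minutes(15, 200, 6, 300))  # 60.0 minutes on battery
```

So a "15 minute" UPS plant can plausibly carry a lightly loaded room for an hour, which is exactly why cooling (or at least emergency ventilation) during an extended UPS run matters.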

And yes, this is what it takes to go from 99.999% to 99.9999%. It's really really hard to get that extra 9!
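For reference, the downtime budget implied by each nines level is simple arithmetic (assuming a 365.25-day year):

```python
# Downtime budget implied by an availability figure, per 365.25-day year.

def downtime_seconds_per_year(availability_percent):
    year_seconds = 365.25 * 24 * 3600
    return (1 - availability_percent / 100) * year_seconds

print(downtime_seconds_per_year(99.999))   # ~316 s, about 5.3 minutes/year
print(downtime_seconds_per_year(99.9999))  # ~32 seconds/year
```

Every extra 9 cuts your allowable downtime by a factor of ten, and at six nines a single 5-minute outage blows the budget for a decade.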