Why Should I Choose a Data Center? (Part 2 of 5)

By Randy Goodson


So, last week we talked about power and how using a colocation data center can be cost-effective, easier, and more beneficial from an electrical perspective.  This week we are going to tackle the issue of keeping your equipment cool.  To keep things consistent, we will return to our three examples of in-house data center environments: downtown Big City, USA; Hill Top Rural College Town, USA; and New State-of-the-Art Town, USA.

In our Big City, USA location, let’s say that we are on the 38th floor of a 70-story building. We have decided to build a proper data center for our flourishing downtown business, which occupies five full floors.  We have appropriated space where we can install a proper computer room.  We’ve decided to use the highly efficient method of a Hot Aisle In-Row Cooling system.  We have allocated space for the necessary in-row cooling units in our new data center. Now we need to decide what to do with the chillers and pumps.  We could put them inside, but that’s not very wise, because we would be heating the very air we are trying to cool.  So, we should put them outside.  Oops…the only place we can put them is on the roof of the building.  If, by some miracle, we get permission to put the heat exchangers on the roof, we need to be able to pump the chilled water to our cooling units 32 floors below. At roughly 15 feet per floor, with a direct run, we would have to move the chilled water nearly 500 feet. Even with an extremely efficient closed-loop system, the chilled water is going to warm up considerably before it even reaches the cooling units.  This makes the system far less efficient than it was designed to be.
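The back-of-the-envelope math above is easy to check. Here is a minimal sketch of that calculation, assuming the rough 15-feet-per-floor figure from the text (real floor heights vary by building):

```python
def chilled_water_run_ft(roof_floor: int, dc_floor: int,
                         ft_per_floor: float = 15.0) -> float:
    """One-way vertical pipe run, in feet, from rooftop chillers
    down to the data-center floor. ft_per_floor is the assumed
    floor height from the text, not a measured value."""
    return (roof_floor - dc_floor) * ft_per_floor

# 70-story building, data center on the 38th floor: 32 floors of pipe.
run = chilled_water_run_ft(roof_floor=70, dc_floor=38)
print(run)  # 480.0 -- "nearly 500 feet," as described above
```

And that 480 feet is only the one-way drop; a closed loop has to move the water back up as well, which is part of why the pumping and heat-gain penalty adds up.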

I know this may come as a surprise to you, but older buildings were not designed with technology in mind (note the sarcasm).  My guess is that in our National Historic Registry building in Hill Top Rural College Town, USA, we are probably in a room with windows along the outside wall of the building. So, in order to prepare our data center for adequate cooling, we should probably replace the windows, if not wall them off completely.  The older windows will not be very efficient and will leak the cooling that we are trying to achieve.  Then, we need to be able to cool the room.  I am going on the assumption that the building doesn’t have a room large enough to house an In-Row Cooling System effectively.  So, we will need to install a less efficient split system of some sort. And, we need to make sure that the system moves enough air to keep the computer room (data center) cool. And, if the building is indeed on the National Historic Registry, we need to comply with its regulations about how the building is to be maintained.  For instance, the outside of the building may be unique enough that external units such as chillers will not be allowed near the building unless they are designed to blend in with the architecture.

If we are in our new, State-of-the-Art building with a well-designed data center already installed, we will still need to maintain the system; even new, technologically advanced, “maintenance-free” cooling systems need to be taken care of.  This requires either training for the IT or Facilities staff, or outsourcing the maintenance (both preventive and corrective) to someone who is trained.  Either way, there is additional expense on top of the cost of operating the system.  And, what if your business is growing faster than you predicted in your business plan?  What if you need to add more technology than you anticipated?  You will need to ensure adequate cooling capacity for your growth.  At the very least, this is neither easy nor inexpensive.  And, any points you may have gained with the CFO will be lost.

A data center will have all of this taken care of before you even walk through their door. If something breaks, it is their responsibility (and expense), not yours.  If you need more cooling capacity, it is their responsibility (and expense), not yours.  And, once again, you get to move your expenditures over to the OPEX side of the books instead of a large one-time CAPEX outlay with ongoing OPEX expenditures. SCORE another vote for the third-party data center!  The CFO may just start liking you again.

Check back next week when I seek the advice of Al Gore on the sensitive subject of Internet Connectivity.