How Do I Keep My Data Cool?

In 1965, Gordon Moore predicted that the number of transistors on a chip, and with it computing power, would double roughly every two years at the same cost. This prediction is known as Moore’s Law. In theory, Moore’s Law is the very definition of rapid progress, something greatly valued in the tech world. But as we now know, advancement in one area introduces needs in another.

As computing power has exploded, so has our demand for it in every corner of commerce and private life. Server banks consume significant energy, and keeping that hardware operating at top capacity takes focused effort, both so it can handle increased demand and so it can maintain digital archives. Preserving those archives means the data must be “kept cool.”

Though a large portion of data is now stored in the cloud, that doesn’t eliminate the need for physical servers. Whether on site or off, servers are still required to support your data storage needs. While the physical footprint you need may be smaller, the work your storage hardware does is greater; in fact, data centers are now estimated to have a carbon footprint comparable to that of the airline industry. Greater processing demand means more power, and more power means more risk of overheating. Overheating can degrade service in ways that erode customer faith, and a loss of service can cost a company up to $50,000 per hour.


Data center cooling is not the same as comfort cooling

The techniques used to maintain comfortable temperatures in indoor environments are not the same as what you need to keep data stacks cool. Legacy cooling systems such as computer room air conditioners (CRACs) and computer room air handlers (CRAHs) are outmoded and ineffective for today’s hardware.

This is because comfort cooling systems are designed for intermittent operation, a poor fit when a consistent standard of cooling must be maintained. For these situations, precision cooling, which may draw on methods ranging from chilled air to pumped water, is a more effective and reliable option.


Cooling best practices

The first standard for data center cooling was released in 2004 by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE). At that time, ASHRAE recommended temperatures of 68-77°F (20-25°C). The most recent guidelines, updated in 2016, take into account everything from the type of equipment to its age, since hardware may become less efficient over time and emit more thermal energy.
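As a rough illustration, a monitoring script might flag readings that drift outside the recommended envelope. Here is a minimal Python sketch using the 2004 band of 68-77°F; the hardcoded sensor reading stands in for whatever your monitoring hardware reports:

    # Check a server-room temperature reading against ASHRAE's
    # 2004 recommended envelope of 68-77°F (20-25°C).
    ASHRAE_MIN_F = 68.0
    ASHRAE_MAX_F = 77.0

    def within_ashrae_range(temp_f: float) -> bool:
        """Return True if a reading falls inside the recommended band."""
        return ASHRAE_MIN_F <= temp_f <= ASHRAE_MAX_F

    reading_f = 79.5  # example sensor reading in Fahrenheit
    if not within_ashrae_range(reading_f):
        print(f"Warning: {reading_f}°F is outside the 68-77°F envelope")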

Just as there are different cooling methods, there are corresponding stages of data center cooling, including server cooling, space cooling, heat rejection, and fluid conditioning.

New cooling technologies

According to the Building Services Research and Information Association, over the next 5-10 years traditional computer room air conditioning will be replaced by advanced techniques such as immersion cooling and precision cooling, the latter combining free cooling, liquid cooling, and chilled water cooling.


What type of cooling units do you need?

Currently, Microsoft is on the cutting edge of data cooling technology, as evidenced by its new underwater data center off the coast of Scotland. The initiative, called Project Natick, now consists of a 40-foot vessel holding 12 racks with 864 servers. If adopted more broadly, centers like Microsoft’s could be deployed quickly and serve as a model of sustainable energy usage, since they are powered by renewable sources.


Until that day, organizations must consider the size and location of their own storage facilities in order to select the option that works best for them. And options do abound, including rack-, row-, panel-, and containment-based solutions suited to particular use cases.


What size cooling unit do you need?

Calculating your thermal output isn’t always an exact science, though estimated heat outputs, other heat sources, and humidity can be combined to approximate thermal loads. To adequately plan for your servers’ cooling needs, you must consider multiple factors: server heat output most obviously, but also room lighting, windows, the people who work in the room, and more.
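To make the arithmetic concrete, here is a minimal back-of-the-envelope estimate in Python. The conversion factor (1 watt of power dissipates roughly 3.412 BTU/hr of heat, and 12,000 BTU/hr equals one ton of cooling) is standard; the per-person and window figures are rough illustrative assumptions, not engineering values:

    # Rough thermal-load estimate for a server room, in BTU/hr.
    WATTS_TO_BTU_HR = 3.412  # 1 W of power ≈ 3.412 BTU/hr of heat

    def estimate_thermal_load(server_watts, lighting_watts,
                              occupants, window_area_sqft):
        server_btu = server_watts * WATTS_TO_BTU_HR
        lighting_btu = lighting_watts * WATTS_TO_BTU_HR
        people_btu = occupants * 400        # ~400 BTU/hr per person (rough assumption)
        window_btu = window_area_sqft * 60  # solar gain per sq ft (rough assumption)
        return server_btu + lighting_btu + people_btu + window_btu

    # Example: 20 kW of servers, 500 W of lighting, 2 technicians, 30 sq ft of glass.
    load = estimate_thermal_load(20_000, 500, 2, 30)
    print(f"Estimated load: {load:,.0f} BTU/hr "
          f"(~{load / 12_000:.1f} tons of cooling)")

A proper assessment would also account for humidity and equipment age, but even a sketch like this shows why server heat output dominates the calculation.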


Make sure you have a Plan B

Even the best-laid plans can go awry, so make sure your IT team is ready by planning ahead for cooling system complications. Portable air conditioners or other temporary cooling systems can often be brought in to address immediate needs after a system failure, so make sure you know where to find them.


Monitoring conditions is essential

Today’s computing landscape has fully realized Gordon Moore’s prediction. To keep up, organizations must be proactive about the climate conditions of their server environments. Everything from temperature, humidity, and airflow to door openings can and should be monitored to protect your company’s most irreplaceable asset: its data.
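In practice, that kind of monitoring often reduces to polling sensors and alerting on threshold breaches. This closing Python sketch shows the idea; the thresholds are illustrative, and the example readings stand in for whatever your sensor platform actually returns:

    # Check environmental readings and alert when they leave safe bounds.
    THRESHOLDS = {
        "temp_f": (68.0, 77.0),        # ASHRAE 2004 recommended band
        "humidity_pct": (40.0, 60.0),  # illustrative relative-humidity band
    }

    def check_conditions(readings: dict) -> list[str]:
        """Return alert messages for any out-of-range readings."""
        alerts = []
        for name, (low, high) in THRESHOLDS.items():
            value = readings.get(name)
            if value is not None and not (low <= value <= high):
                alerts.append(f"{name}={value} outside [{low}, {high}]")
        if readings.get("door_open"):
            alerts.append("server room door is open")
        return alerts

    # Example readings a sensor poll might return.
    sample = {"temp_f": 80.2, "humidity_pct": 55.0, "door_open": True}
    for alert in check_conditions(sample):
        print("ALERT:", alert)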