
Data Center Cooling

Noun | Sounds like: "da-ta cen-ter cool-ing"

Data center cooling is the system that maintains the ideal operating environment for all of the equipment in a data center. Data centers collectively consume about three percent of the world's electricity, and that electricity generates a significant amount of heat. If that heat is not managed by an effective cooling system, IT equipment is likely to malfunction, incur expensive damage, and cause downtime.

According to ASHRAE, the ideal temperature for server inlets (the air drawn into the server to cool its internal components) is between 64.4°F and 80.6°F (18°C to 27°C), with a relative humidity between 40% and 60%.
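
As a minimal sketch of how those thresholds might be checked against your own inlet sensor readings (the sensor names and values below are invented for illustration, not taken from any particular product):

```python
# Minimal sketch: flag server-inlet readings that fall outside the ASHRAE
# recommended envelope cited above (64.4-80.6 F, 40-60% relative humidity).
# Sensor names and readings are made up for illustration.

ASHRAE_TEMP_F = (64.4, 80.6)   # recommended inlet temperature range, F
ASHRAE_RH_PCT = (40.0, 60.0)   # recommended relative humidity range, %

readings = {
    "rack-A01-inlet": {"temp_f": 72.5, "rh_pct": 45.0},
    "rack-B07-inlet": {"temp_f": 83.1, "rh_pct": 38.0},  # too hot and too dry
}

def out_of_range(value, low, high):
    """Return True when a reading falls outside an inclusive range."""
    return value < low or value > high

for sensor, r in readings.items():
    problems = []
    if out_of_range(r["temp_f"], *ASHRAE_TEMP_F):
        problems.append(f"temperature {r['temp_f']} F")
    if out_of_range(r["rh_pct"], *ASHRAE_RH_PCT):
        problems.append(f"relative humidity {r['rh_pct']} %")
    if problems:
        print(f"{sensor}: outside ASHRAE envelope ({', '.join(problems)})")
```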

How Are Data Centers Cooled?

Traditionally, data centers used a combination of raised floors and computer room air conditioner or air handler (CRAC/CRAH) units to cool the space. The CRAC/CRAH units pressurized the space below the raised floor and pushed cold air up through perforated tiles and into the server intakes. The air vented out of the servers as hot exhaust and returned to the CRAC/CRAH to be cooled again, which made the unit's return temperature the main control point for the entire data center.

This approach was inefficient because it offered little control: cold air was simply flooded into the server room. It may have been adequate for low-density deployments with modest power requirements, but it falls short on higher-density data floors.
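
To get a feel for the volumes of air involved, the back-of-the-envelope sketch below uses the standard sensible-heat relationship to estimate how much air the cooling units must move for a given IT load. The load and temperature-rise figures are illustrative assumptions, not values from this article.

```python
# Back-of-the-envelope sketch: how much air a CRAC/CRAH must move to carry
# away a given IT load, using the standard sensible-heat relationship
#   P = rho * c_p * airflow * delta_T
# The load and temperature-rise figures below are illustrative only.

RHO_AIR = 1.2      # air density, kg/m^3 (approximate, near sea level)
CP_AIR = 1005.0    # specific heat of air, J/(kg*K)

def required_airflow_m3s(it_load_w, delta_t_c):
    """Airflow (m^3/s) needed to remove it_load_w watts at a delta_t_c rise."""
    return it_load_w / (RHO_AIR * CP_AIR * delta_t_c)

it_load_w = 50_000   # 50 kW of IT load on the data floor
delta_t_c = 11.0     # roughly 11 C (20 F) rise from inlet to exhaust

flow_m3s = required_airflow_m3s(it_load_w, delta_t_c)
flow_cfm = flow_m3s * 2118.88   # convert m^3/s to cubic feet per minute

print(f"{it_load_w/1000:.0f} kW at a {delta_t_c:.0f} C rise needs "
      f"about {flow_m3s:.1f} m^3/s ({flow_cfm:,.0f} CFM) of supply air")
```

Note that, for a fixed load, doubling the allowable temperature rise roughly halves the required airflow, which is one reason the containment strategies described next pay off.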

Modern data centers adopted hot aisle and cold aisle containment strategies to physically separate the cool air intake from the hot air expelled from the servers’ exhaust vents. Without mixing the hot and cold air, temperatures remain more consistent, providing data center managers more control over their environment. More specifically, data center environments are maintained by cooling technologies such as:

  • Calibrated vectored cooling (CVC). Designed specifically for blade servers, CVC optimizes the airflow and allows the cooling system to manage heat more effectively. This increases the ratio of circuit boards per server chassis and requires fewer fans.
  • Chilled water system. Typically used in medium and large data centers, a chilled water system supplies cold water from a chiller plant located within the facility to air handlers (CRAC/CRAH units), which use it to cool the air they circulate.
  • Cold aisle/hot aisle containment. Data center containment is a common form of server rack deployment that uses alternating rows of “cold aisles” and “hot aisles.” The cold aisles have cold-air intakes on the front of the racks, while the hot aisles have hot-air exhausts on the back. The hot aisles expel hot air toward the air conditioning intakes, where it is cooled and then supplied back into the cold aisles. Blanking panels fill empty rack spaces to prevent overheating and wasted cold air.
  • Computer room air conditioner (CRAC). CRAC units are one of the most common features of any data center. They are powered by a compressor that draws air across a cooling unit. Although inefficient, CRAC units are relatively inexpensive.
  • Computer room air handler (CRAH). CRAH units are part of the data center's chilled water plant system. Chilled water from the plant flows through a cooling coil within the CRAH unit, and modulating fans draw warm air from the data center across the coil. These units are most efficient when used in locations with colder annual temperatures.
  • Direct-to-chip cooling. This liquid cooling method uses pipes to deliver coolant to cold plates mounted directly on a motherboard's processors, where it absorbs their heat. The extracted heat is passed to a chilled-water loop and carried back to the chilled water plant. This is one of the most effective cooling technologies because the processors are cooled directly.
  • Evaporative cooling. When hot air is exposed to water, the water evaporates and draws heat out of the air. The water is often delivered by a misting system or a wet material such as a mat, eliminating the need for CRAC/CRAH units. Although this is an energy-efficient technology, it requires a lot of water as well as data center cooling towers to help facilitate evaporation.
  • Free cooling. In certain climates, free cooling brings cold air from the external environment into the servers instead of continuously cooling the same air. This is also a very energy-efficient form of server cooling.
  • Immersion system. This relatively new liquid cooling technique involves submerging hardware into a bath of non-conductive, non-flammable dielectric fluid or coolant.
  • Raised floor. A raised floor provides space between the data center floor and the building’s concrete slab floor. This space houses water-cooling pipes and allows for increased airflow.

The Optimal Data Center Cooling System

With multiple cooling technologies being used within a data center, there is significant opportunity for errors to occur. These errors include:

  • An inefficient cabinet layout
  • Empty cabinets
  • Empty spaces between equipment
  • Raised floor leaks
  • Leaks around cable openings
  • Multiple air handlers fighting to control humidity

If these mistakes are not resolved, they can be very costly and can also impact the data center's uptime. A solid understanding of how data center cooling works, and of what cooling looks like for a facility's specific layout and technical requirements, is imperative to an optimal cooling system. Some identified best practices for ensuring optimal data center cooling are:

  • Using hot aisle/cold aisle design. This design is the most efficient because separating cold and hot air yields cooling energy savings of 10 to 35 percent. The airflow scheme also allows slower fan speeds and more frequent use of free cooling from airside or waterside economizers, which are very energy-efficient cooling technologies.
  • Implementing containment measures. Managing airflow can be difficult, but extensive containment measures such as walls and doors keep cold air in the cold aisles and hot air in the hot aisles. This allows data centers to operate at higher rack densities while also reducing energy consumption.
  • Inspecting and sealing leaks in perimeters, support columns, and cable openings. Water damage is a significant problem for data centers and is the second leading cause of data loss and downtime. Data centers cannot afford this threat, financially or operationally, so frequent and thorough inspections of the liquid cooling systems are imperative. Tools such as fluid and chemical sensing cables, zone controllers, and humidity sensors can make identification and repair much quicker.
  • Synchronizing humidity control points. Free cooling, although energy efficient, can let moisture into the data center. Too much moisture leads to condensation, which will ultimately corrode and affect electrical systems. Simply adjusting the climate controls can create further issues, as overly dry air allows static electricity to build up. Instead, data center managers should pay close attention to humidity control points to maintain an ideal environment within the server rooms (see the dew-point sketch after this list).
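
To make the condensation risk in that last point concrete, here is a minimal sketch that estimates the dew point from temperature and relative humidity using the Magnus approximation and flags supply air cold enough to condense moisture. The coefficients are a common textbook choice, and the readings are made up for illustration.

```python
import math

# Sketch: estimate dew point from temperature and relative humidity using the
# Magnus approximation, then flag cases where cold supply air could drop to
# the dew point and condense. Constants and readings are illustrative.

A, B = 17.27, 237.7   # Magnus coefficients (temperature in C)

def dew_point_c(temp_c, rh_pct):
    """Approximate dew point (C) via the Magnus formula."""
    gamma = (A * temp_c) / (B + temp_c) + math.log(rh_pct / 100.0)
    return (B * gamma) / (A - gamma)

room_temp_c = 24.0    # room air temperature
room_rh_pct = 65.0    # humidity has crept above the 60% guideline
supply_air_c = 15.0   # coldest supply-air temperature in the space

dp = dew_point_c(room_temp_c, room_rh_pct)
print(f"Dew point: {dp:.1f} C")
if supply_air_c <= dp:
    print("Warning: supply air is at or below the dew point; condensation risk.")
```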

How DCIM Software Impacts the Cooling System

The ideal Data Center Infrastructure Management (DCIM) solution optimizes the data center's energy use, asset utilization and performance, and space utilization, all of which affect the cooling system. DCIM collects, reports, and trends data from environmental sensors such as temperature and humidity probes. This gives managers a clearer picture of when to adjust temperature set points and simplifies how they manage airside economization. DCIM also makes it easy to visualize hotspots with thermal time-lapse video, avoid overcooling and wasted energy, keep cabinets within ASHRAE guidelines, and increase data center sustainability.
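
As a generic illustration (not Sunbird's API), the sketch below shows the kind of trending and thresholding such a tool performs on exported sensor data; the cabinet names and readings are invented.

```python
from statistics import mean

# Generic illustration (not Sunbird's API): trend hourly inlet-temperature
# readings per cabinet and flag cabinets whose average exceeds the ASHRAE
# recommended maximum of 80.6 F. Readings are made up.

ASHRAE_MAX_F = 80.6

hourly_inlet_temps_f = {
    "cab-01": [74.1, 75.0, 76.2, 77.8],
    "cab-02": [79.9, 81.3, 82.0, 82.6],   # trending into a hotspot
}

for cabinet, temps in hourly_inlet_temps_f.items():
    avg = mean(temps)
    trend = temps[-1] - temps[0]
    status = "HOTSPOT" if avg > ASHRAE_MAX_F else "ok"
    print(f"{cabinet}: avg {avg:.1f} F, trend {trend:+.1f} F -> {status}")
```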

Want to see how Sunbird's world-leading DCIM software makes it easy for you to avoid overcooling and wasting energy? Get your free test drive now!
