BLOG
3.8.2016
Last updated: 3.1.2023

Airflow containment refers to the practice of segregating the aisles of a data center so the hot exhaust air from servers does not mix with incoming cold air, while also directing airflow into or out of the data center floor more efficiently. According to the Uptime Institute’s 2014 Data Center Industry Survey, only 30% of operators use some form of containment in at least three-quarters of their data center, and less than half of all survey respondents had at least 50% of their data center heat contained.

That leaves a lot of white space without any form of containment, even though containment is one of the best ways to improve energy efficiency, translating into a more reliable environment as well as direct cost savings.

Things have certainly improved in the years since. But airflow containment remains a significant upfront investment that data center operations teams might not consider, especially at smaller providers or in-house facilities. It can, however, show a real ROI.

Green House Data is currently evaluating containment in our Seattle, WA data center, which is inside the Westin Building Exchange. In all likelihood we’ll have it installed on at least one floor this year, and expect to save enough energy to pay for the equipment and installation costs within a couple of years. That seems like a while, but after that period all the energy we save is essentially profit.
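A payback period like the one described above is simple to estimate. The sketch below uses purely hypothetical numbers for illustration, not Green House Data's actual costs or savings:

```python
# Hypothetical payback-period estimate for a containment project.
# All figures below are illustrative assumptions, not real project data.
upfront_cost = 120_000.0      # containment equipment + installation ($)
monthly_kwh_saved = 50_000.0  # estimated monthly energy savings (kWh)
rate_per_kwh = 0.08           # blended utility rate ($/kWh)

monthly_savings = monthly_kwh_saved * rate_per_kwh   # $ saved per month
payback_months = upfront_cost / monthly_savings      # months to break even

print(f"Monthly savings: ${monthly_savings:,.0f}")
print(f"Payback period: {payback_months / 12:.1f} years")
```

After the break-even point, the recurring energy savings continue with no further capital outlay, which is why a multi-year payback can still be worthwhile.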

 

Why Contain?

ASHRAE has raised its recommended data center temperature range, but very high temperatures (over 100 degrees Fahrenheit) can lead to equipment failure. Overcooling, on the other hand, is very expensive when you’re pumping air-conditioned air through tens of thousands of square feet.

In addition to improving cooling efficiency by up to 40%, containment also eliminates “hot spots” in the data center, or at least minimizes them. Hot spots are areas where hot air pools, causing equipment problems. Containment allows economizers or free cooling equipment to run for more hours per year, and cooling systems can also operate above the dewpoint temperature, reducing the use of humidifiers or dehumidifiers on the floor.

 

What Containment Options Exist?

Aisles can have partial or full containment, in either a cold aisle or hot aisle configuration. A partial containment option might be as simple as adding plastic flaps to the ends of your aisles. Even this small step can have an impact on efficiency. With a partial containment solution, airflow can still escape around the edges and tops of your aisles, but each aisle’s hot exhaust faces another hot aisle rather than a cold intake.

Full containment involves sealing off the entire aisle, with doors at the ends and barriers running from the tops of the cabinets to the ceiling. This is a less flexible option, so you’ll need a highly designed environment and plan ahead, but it is far more efficient.

Cold aisle containment is less efficient than hot aisle containment. It seals off the cold aisle, where incoming air-conditioned air enters the servers; the rest of the data center floor, which remains open, sits at the temperature of the hot exhaust air. Hot aisle containment is the opposite: the exhaust heat is trapped inside the contained area and ducted from there off the data center floor.

When evaluating containment, calculate the ratio of cooling capacity from your cooling equipment to the estimated heat load from the full data center floor (be sure to project out to completely filled cabinets, not just your initial deployment). This cooling capacity ratio can be greatly impacted by modifying the air plenums (whether perforated floor tiles or overhead), encouraging neat cabling practices, sealing empty rack spaces, reducing the airflow rate, and increasing temperatures.
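That ratio calculation can be sketched in a few lines. The numbers below are hypothetical placeholders, assuming a projected per-cabinet load at full build-out rather than day-one deployment:

```python
# Illustrative cooling-capacity-ratio check (hypothetical numbers).
# Project the heat load for fully populated cabinets, not just the
# initial deployment, as recommended above.
cooling_capacity_kw = 500.0    # total rated capacity of cooling units (kW)
cabinets = 100                 # cabinet count at full build-out
design_kw_per_cabinet = 4.0    # projected heat load per cabinet (kW)

projected_heat_load_kw = cabinets * design_kw_per_cabinet
ratio = cooling_capacity_kw / projected_heat_load_kw

print(f"Cooling capacity ratio: {ratio:.2f}")
```

A ratio well above 1.0 indicates headroom, or possibly overcooling; containment, sealed rack spaces, and higher setpoints let you operate closer to 1.0 without pushing servers toward failure.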

Play around with all of these factors while modeling containment to find a comfortable area where you are not overcooling, not pushing your servers too close to a failure point, and operating as efficiently as possible.

Arthur Salazar, Green House Data



Posted by Art Salazar, Director of Data Centers & Compliance
