
Do you remember being admonished as a child for holding the refrigerator door open? Or what about air conditioning – we often hear that raising our thermostats slightly in the summer can result in significant savings.


The data centre is no different in terms of energy conservation strategies; we just do things on a much larger scale. 

There was a time when walking into a data centre felt like walking through the frozen food aisle in the grocery store – a great place to beat the summer heat, but also a great waste of energy. Those days are mostly behind us, though. Data centre temperatures have been on the rise for some time as IT equipment manufacturers have recognized the need to reduce cooling costs and have certified their gear to operate at ever higher temperatures.

In addition to higher supply air temperatures, there is now a much greater focus on keeping supply and return air separated, which is essential for maximum cooling efficiency. This separation is best achieved through some type of containment system enclosing either the hot or cold sides of the equipment racks.

Whipcord's Hot Aisle Containment

Our colocation facilities utilize hot aisle containment. In this design, two rows of racks are placed back-to-back. Ceiling panels and end-of-row doors complete the containment system, creating a fully enclosed hot aisle. IT equipment pulls cooler air in from the front of the racks and exhausts hot air into the containment area. We use in-row cooling units that sit between racks and take the hot air from the containment area and cool it using chilled glycol before returning the air to the rest of the room, which constitutes the “cold” zone.

Hot aisle containment with in-row cooling is much more efficient than a traditional raised-floor data centre, where cooling units placed around the perimeter of the room feed cold air under the floor and up through vented tiles. The racks and in-row coolers form a close-coupled system with less flow loss, fewer leaks, and faster response to changes in heat load.
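To make the airflow argument concrete, here's a rough sketch using the standard sensible-heat relation (heat removed = air density × airflow × specific heat × temperature difference). The rack load and delta-T values are illustrative assumptions, not figures from our facilities.

```python
# Rough sketch: airflow needed to remove a given IT heat load.
# Q = rho * V * cp * dT, so V = Q / (rho * cp * dT).
RHO_AIR = 1.2    # kg/m^3, approximate air density
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def airflow_for_load(heat_load_kw: float, delta_t_k: float) -> float:
    """Airflow (m^3/s) needed to carry heat_load_kw away at a given
    supply-to-return temperature difference (delta_t_k, in kelvin)."""
    return heat_load_kw * 1000.0 / (RHO_AIR * CP_AIR * delta_t_k)

rack_load_kw = 10.0  # hypothetical per-rack IT load

# Poor separation: exhaust mixes back into supply air, shrinking delta-T.
print(f"{airflow_for_load(rack_load_kw, 8):.2f} m^3/s at 8 K delta-T")    # ~1.04
# Full containment: the coolers see the entire exhaust delta-T.
print(f"{airflow_for_load(rack_load_kw, 16):.2f} m^3/s at 16 K delta-T")  # ~0.52
```

Because fan power scales roughly with the cube of airflow, halving the required airflow cuts fan energy by far more than half.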

In addition to containment, we use “dry coolers” to take advantage of the relatively cool Alberta climate. Mechanical chillers rely on large compressors that consume a lot of energy. The dry coolers, by contrast, are essentially radiators and fans that cool the glycol directly with outside air, using much less power.
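The switchover logic can be sketched very simply: run the dry coolers whenever outdoor air is cold enough to chill the glycol to the target supply temperature, and fall back to the mechanical chillers otherwise. The approach temperature and setpoint below are assumptions for illustration, not our actual control parameters.

```python
# Conceptual free-cooling decision. Assumes the dry cooler can bring
# the glycol to within a fixed "approach" of the outdoor temperature.
DRY_COOLER_APPROACH_C = 8.0  # assumed approach temperature

def use_dry_cooler(outdoor_temp_c: float, target_supply_c: float = 18.0) -> bool:
    """True if the dry coolers alone can reach the target glycol temperature."""
    return outdoor_temp_c + DRY_COOLER_APPROACH_C <= target_supply_c

for temp_c in (-20.0, 5.0, 12.0, 25.0):
    mode = "dry coolers" if use_dry_cooler(temp_c) else "mechanical chillers"
    print(f"{temp_c:6.1f} C outside -> {mode}")
```

In a climate like Alberta's, where outdoor temperatures sit below the setpoint for much of the year, the compressors can stay off most of the time.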

Virtualization technology has also helped tremendously in reducing data centre power use. Modern CPUs are extremely powerful, and we’re able to run many virtual machines on a single, power-efficient server. Our multi-tenant private cloud, for example, offers our customers an extremely power-efficient option with a much smaller carbon footprint than traditional bare metal colocation.
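A back-of-envelope comparison shows why consolidation pays off. The wattages and consolidation ratio below are hypothetical, not measurements from our cloud.

```python
# Hypothetical per-workload power: dedicated servers vs. consolidated VMs.
BARE_METAL_WATTS = 250.0  # assumed draw of one dedicated server
VIRT_HOST_WATTS = 400.0   # assumed draw of one virtualization host
VMS_PER_HOST = 30         # assumed consolidation ratio

print(f"bare metal:  {BARE_METAL_WATTS:.0f} W per workload")
print(f"virtualized: {VIRT_HOST_WATTS / VMS_PER_HOST:.1f} W per workload")  # ~13 W
```

Even though the virtualization host draws more than a single dedicated server, the per-workload power falls by more than an order of magnitude in this scenario.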

The data centre industry as a whole uses an enormous amount of power. In the United States, it's estimated that data centres account for roughly 2% of the nation’s electricity usage, equivalent to the consumption of 6.4 million average households. Although our own power use is much more modest, we nonetheless strive to operate as efficiently as possible and minimize our contribution to global warming.
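The household equivalence is straightforward arithmetic. Assuming the widely cited estimate of roughly 70 billion kWh per year for US data centres and an average household using about 10,900 kWh annually:

```python
# Converting national data centre consumption into household equivalents.
US_DATA_CENTRE_KWH = 70e9  # ~70 billion kWh/year, widely cited US estimate
HOUSEHOLD_KWH = 10_900     # approximate average US household annual usage

print(f"{US_DATA_CENTRE_KWH / HOUSEHOLD_KWH / 1e6:.1f} million households")  # ~6.4
```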


Written by Dan Hamilton, COO

Dan brings more than 20 years of diverse experience, from hardware and software development to critical facility infrastructure, and is responsible for Whipcord’s overall service delivery and product strategy.