Data Center Efficiency Plateaus

A recent survey shows that efficiency is no longer a major priority for data center operators. Meanwhile, public cloud and DCIM software are on the rise.

Kurt Marko

August 15, 2013


The latest Uptime Institute Data Center Industry Survey reveals some interesting trends, including a reduced focus on data center efficiency.

According to Uptime's survey, which queried 1,000 data center facility operators, IT managers and senior executives from around the globe, efficiency, as measured by median responses (so we're not talking about behemoths like Amazon, Google or Facebook), has plateaued and is no longer considered an urgent priority. Only half of North American respondents said they considered efficiency to be very important.

Uptime's data shows that PUE, the standard metric for data center efficiency, improved dramatically from 2007 to 2011, but those initial gains were largely the result of easy fixes like properly isolating hot and cold aisles, installing blanking panels in unused rack segments and upgrading old power distribution equipment to more efficient models. Now, however, improvements are much harder and more costly to come by. Thus, most operators consider a 1.65 PUE (the average in this year's survey) good enough, even as the mega colocation centers and cloud operators race to see who can edge closer to the ideal level of 1.0.
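For readers who don't track the metric, PUE is simply the ratio of total facility power to the power delivered to IT equipment, so 1.0 means every watt goes to useful computing. A minimal sketch of the arithmetic, using illustrative numbers rather than anything from the survey:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    IT equipment power. 1.0 is the theoretical ideal."""
    return total_facility_kw / it_equipment_kw

# Illustrative load figures only (not taken from the Uptime survey):
# 1,650 kW of total facility draw supporting 1,000 kW of IT load
# works out to the 1.65 average reported this year.
print(pue(1650, 1000))  # 1.65
```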

An easy fix can be borrowed from every homeowner trying to cut their summer electric bill: just crank up the thermostat. Only 7% of respondents operate data centers at temperatures above 75 degrees Fahrenheit, even though ASHRAE, the professional society of HVAC engineers, says 80 degrees is a reasonable upper bound.

Another drag on efficiency is the prevalence of zombie servers, the survey indicated. "According to Uptime Institute’s estimates based on industry experience, around 20% of servers in data centers today are obsolete, outdated or unused," the report said. Uptime estimates that for every 1U zombie unplugged, operators save about $2,500 a year in energy, OS licenses and hardware maintenance.
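To see what that estimate implies at scale, here's a rough back-of-the-envelope sketch; the fleet size is hypothetical, while the 20% share and $2,500-per-server figure come from the report:

```python
SAVINGS_PER_ZOMBIE = 2500  # Uptime's estimate: $/year in energy, OS licenses, maintenance
ZOMBIE_SHARE = 0.20        # Uptime's estimate of obsolete, outdated or unused servers

fleet_size = 500           # hypothetical count of 1U servers in a facility
zombies = int(fleet_size * ZOMBIE_SHARE)
annual_savings = zombies * SAVINGS_PER_ZOMBIE

print(f"Unplugging {zombies} zombies saves about ${annual_savings:,} per year")
# Unplugging 100 zombies saves about $250,000 per year
```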

[Uptime's study also indicated that data centers are becoming the domain of service providers as smaller enterprises increasingly outsource their data center operations. Read Kurt Marko's analysis in "Data Center Study: The Big Get Bigger."]

In addition to data center efficiency trends, the Uptime report highlights three data center technologies that are poised for explosive growth: adoption of public cloud services, data center infrastructure management (DCIM) and prefab modular data centers. While we'd agree on the first two, we have our doubts about modulars.

Public cloud growth is a no-brainer: nearly every survey, including ours, shows that enterprise resistance, fueled by a combination of protectionism, security and performance FUD, and immature management software, is rapidly crumbling. Only 20% of the respondents to InformationWeek's State of Cloud Computing Survey have no plans to use a cloud service provider. Uptime finds global cloud adoption still rather low at 28%, but large companies are twice as likely as smaller ones (as defined by the total number of servers they operate) to deploy public cloud services.

In contrast, private cloud seems to have hit a brick wall, with deployment actually falling in Uptime's survey. Either it's harder than people think, or smaller companies figure: why bother re-architecting for a private cloud when they can rent a ready-made one from AWS or Rackspace?

According to the survey, 38% of respondents use DCIM software, which Uptime defines as a facility-wide system that catalogs assets, collects usage statistics and records operational status. Using some homegrown spreadsheets and open source monitoring tools doesn't qualify, although we would argue that something like Nagios is a long way from DIY Perl scripting and includes many DCIM features. Uptime's number seems high, but the respondent demographics skew large, with 82% managing more than one site and 42% in the business of data center hosting as a colocation or cloud service provider.
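To make the distinction from a spreadsheet concrete, here is a minimal sketch of the kind of per-asset record a DCIM system keeps; the field names are illustrative assumptions, not any particular vendor's schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DcimAsset:
    """Hypothetical per-asset DCIM record: inventory data, collected
    usage statistics and operational status in one place."""
    asset_id: str
    rack: str
    rack_units: int
    power_draw_watts: float     # collected usage statistic
    inlet_temp_f: float         # collected usage statistic
    status: str = "in_service"  # operational status
    last_polled: datetime = field(default_factory=datetime.utcnow)

server = DcimAsset("srv-0042", "row3-rack12", 1, 310.0, 74.5)
print(server.status, server.power_draw_watts)
```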

Of course, only large operators can justify the cost of DCIM tools; looking at just the small companies in Uptime's sample, 72% report spending over $100,000 on DCIM tools, while 17% of the largest ones spend $400,000 or more.

I take issue with Uptime's prediction regarding prefab modular data centers. Those semi-truck shipping containers made into tightly packed computer rooms are a clever idea whose time has come and gone. When first introduced more than five years ago, modulars offered superior energy and space efficiency to conventional facilities, but with some significant downsides. First, you needed to redesign data center facilities to look more like a mobile home park -- with concrete pads and utility drops -- than a self-contained warehouse. Second, with such tight quarters, if -- or make that when -- a modular's cooling system so much as hiccups, the temperature spike could roast everything inside within minutes.

According to Uptime's own data, modular adoption is tepid, with only 8% of data center operators having deployed them and another 8% considering them. The majority of respondents (53%) have no interest. Even among large operators, only 15% have modular deployments.
