Virtual's New Reality

They're gaining popularity as a way of consolidating servers and testing software. So why are some companies still holding off on adopting virtual machines?

July 4, 2005


Virtualization is one of computing's oldest tricks, originating in the 1960s as a way of creating multiple "virtual" systems on a big mainframe. Virtual machines--the specialized software that carries the workload--were an efficient way of getting maximum benefit from what was then a sizable investment in a company's data center. Now the approach is being revived on Windows and Linux servers, as cost-conscious IT departments look to maximize computer utilization and extend the life of aging apps. Today's virtual machines have a broader reach than ever before. They're showing up in PCs and storage systems, too, and soon will be able to take advantage of capabilities built into microprocessors.

Gannett will virtualize servers "approaching old age," Kuzmack says. Photo by David Deal

Virtual machines are catching on because they make it possible to support different applications and operating systems on a single server, dynamically allocate resources where they're needed most, and reduce server head count in the process. They free companies from having to migrate existing applications every time a new operating system is deployed, extending the life of aging but important applications that represent a significant investment. Think Windows NT-based apps. Virtual machines also can be used to develop and test software before deployment, and they provide a less-expensive way of backing up computers in emergencies.

No wonder the technology is being applied in everything from server consolidation to the growing number of Windows 2003 upgrades to the increasing requirement for software testing to avoid hacks, worms, and other threats. In some cases, its use can have a ripple effect on how IT staffers are deployed. System administrators who specialize in an application or operating system may find themselves with responsibilities outside their niche as virtualization results in a mixing of technologies across a computing infrastructure.

Not everyone has made the jump yet. For one thing, virtualization software imposes a performance hit, chewing up processor cycles. For another, it adds a layer of complexity, making systems monitoring and management more difficult. And some vendors aren't helping matters by demanding that companies buy full-priced software licenses for each virtual machine on which their software runs.

Yet virtual machines are attracting attention, in part because most servers run at just 10% to 15% of capacity, and virtualization can boost utilization rates to 70% or higher. While just 16% of companies use virtual servers today, according to a Microsoft survey of customers, the market as a whole is gaining traction. The overall virtualization market--networking, storage, servers, processing, and management--had sales of $15.1 billion last year and is growing at an annual rate of more than 20%, market researcher IDC reports.

Smelling opportunity, some of the largest vendors have bought into the market. Microsoft acquired the virtual-machine assets of Connectix in 2003 to address its lack of virtual-partitioning software. That same year, EMC Corp. purchased one of the leading lights of the virtualization-software market, VMware, the first vendor able to virtualize the x86 instruction set. Earlier this month, EMC unveiled the VMware Technology Network for developers, which includes a subscription service for products and support, and prebuilt applications from vendors such as BEA Systems, MySQL, Novell, Oracle, and Red Hat so that customers can test them on a virtual machine before deployment. Microsoft is likely to duplicate that move as its own virtualization technology matures. Even chip vendors Advanced Micro Devices Inc. and Intel have put stakes in the ground, with processor enhancements on the way for x86 virtual machines (see story, Chip Shot: AMD And Intel Add Virtualization Hooks).

Virtualization is "pretty incredible technology," says Gregory Veltri, CIO of the Denver Health and Hospital Authority. Veltri watched the health-care organization's server count go from 10 in 1996 to 220 earlier this year, with 90% of the servers running a single application and server-utilization rates averaging less than 20%. Veltri decided to take the 15 physical servers that provided Systems Network Architecture gateways into legacy systems and consolidate them onto two physical machines running 15 virtual servers using Microsoft Virtual Server technology. That freed up rack space and reduced the need for more servers for the first time in years.

"We haven't calculated the return on investment," says the former IBM mainframe programmer. "But intuitively, when you're moving system images around and dynamically allocating resources, there's a savings there."

Still, most IT managers haven't yet made a similar move. It's been easier to deploy and manage cheap commodity servers for each application than to use virtualization to maximize utilization rates. And it typically takes 35% or more of a processor's computing power to sustain one or more virtual machines, a drawback that had many businesses delaying adoption until budgets got tight and servers proliferated.
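A quick back-of-the-envelope calculation, using only the rough figures cited in this story, shows how much that overhead erodes the payoff. The sketch below is illustrative arithmetic, not a benchmark; the 5% figure anticipates the hypervisor designs discussed next.

```python
# Back-of-the-envelope consolidation math using the article's rough figures;
# every number here is illustrative, not a benchmark.

host_capacity = 1.00             # one physical server, normalized to 100%
target_utilization = 0.70        # the ceiling operators aim for
avg_guest_load = 0.12            # a standalone server idling at 10%-15%

def guests_per_host(overhead: float) -> int:
    """How many typical guests fit once virtualization takes its cut."""
    usable = (host_capacity - overhead) * target_utilization
    return int(usable / avg_guest_load)

print(guests_per_host(0.35))  # ~3 guests with the older 35% overhead
print(guests_per_host(0.05))  # ~5 guests with a hypervisor's ~5% overhead
```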

Improved virtualization software may change that, especially the recent addition of hypervisor technology for x86 systems. A longtime IBM mainframe staple, a hypervisor is a stripped-down operating-system kernel that runs directly on the hardware. The result is a more efficient design that handles, rather than intercepts, calls for processor cycles, data from memory, and network services. It divides available resources among virtual machines based on percentages set by an administrator and reduces the performance hit on the CPU to 5% or less.

VMware introduced its x86 hypervisor technology, ESX Server, in 2001, and Microsoft, with the help of its Connectix assets, is building a hypervisor into the Longhorn version of Windows Server. It will be able to stage any other x86 operating system alongside Windows, making it much easier to run Windows and Linux on the same machine.
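The percentage-based carve-up is simple to picture in code. The sketch below is a toy model of that idea, with invented class, method, and VM names throughout; real hypervisors schedule resources far more dynamically than this.

```python
# A minimal sketch of the percentage-based division the article describes:
# an administrator assigns each virtual machine a share of the host, and
# the hypervisor carves up the physical resource proportionally. All names
# are invented for illustration, not any vendor's API.

class ToyHypervisor:
    def __init__(self, total_mhz: int):
        self.total_mhz = total_mhz   # physical CPU capacity of the host
        self.shares = {}             # VM name -> admin-assigned percentage

    def set_share(self, vm: str, percent: int) -> None:
        committed = sum(self.shares.values()) - self.shares.get(vm, 0)
        if committed + percent > 100:
            raise ValueError("shares cannot exceed 100% of the host")
        self.shares[vm] = percent

    def allocation_mhz(self, vm: str) -> int:
        """CPU cycles this VM is entitled to under its share."""
        return self.total_mhz * self.shares[vm] // 100

host = ToyHypervisor(total_mhz=3000)
host.set_share("sna-gateway-1", 20)
host.set_share("sna-gateway-2", 20)
print(host.allocation_mhz("sna-gateway-1"))  # 600 MHz of the 3,000 MHz host
```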

A lot of the excitement surrounds Xen, the open-source hypervisor software coming out of the University of Cambridge in the United Kingdom that has garnered enough vendor support to make virtualization a basic infrastructure tool in the data center. The software became so popular among Wall Street trading firms that last year they asked its authors at Cambridge to form a company, XenSource Inc., to support it. Now on its second version, Xen is backed by Hewlett-Packard, IBM, Sun Microsystems, and others that want to standardize virtualization so that they can gear their products to work with it. AMD, IBM, and Intel are building virtualization hooks into their next-generation chips, on which Xen will run. A new version due in August from XenSource is expected to reduce the performance penalty to less than 3%. The fact that Xen is open source, is backed by major vendors, and nearly eliminates the performance penalty "means the Xen initiative will make virtualization much more widespread," predicts Steve McDowell, a strategic marketing manager at chipmaker AMD.

And, it's hoped, easier to manage as more of the major systems-management-software vendors provide support for the Xen initiative. Today, most large-scale systems-management platforms don't recognize virtual machines. Only a few management consoles, such as those from HP, Opsware, and VMware, provide tools for managing virtual machines, Illuminata analyst Jonathan Eunice says. That's a potential problem because a server with multiple virtual machines, each running its own application, might operate at 70% of capacity, and if not carefully managed, the escalating needs of one or more applications could push the server to its limit.
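A toy version of the check such consoles would need might look like the sketch below, which sums each host's per-VM load and flags anything past a threshold; the host and application names are hypothetical.

```python
# A hypothetical monitoring check of the kind the article says most
# management platforms still lack: sum each host's per-VM load and flag
# hosts whose combined demand is creeping toward the physical limit.

from typing import Dict, List, Tuple

def flag_hot_hosts(hosts: Dict[str, Dict[str, float]],
                   threshold: float = 0.70) -> List[Tuple[str, float]]:
    """Return (host, load) pairs whose combined VM load exceeds the threshold.

    hosts maps a host name to {vm_name: fraction_of_host_cpu}.
    """
    hot = []
    for host, vms in hosts.items():
        total = sum(vms.values())
        if total > threshold:
            hot.append((host, round(total, 2)))
    return hot

inventory = {
    "esx-01": {"payroll": 0.30, "intranet": 0.25, "reporting": 0.28},
    "esx-02": {"dev-build": 0.20, "test-web": 0.15},
}
print(flag_hot_hosts(inventory))  # [('esx-01', 0.83)] -- time to move a VM
```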

Management tools should help end "server-huggers." Those are the IT pros who say, "That's my server for my application, and that's it," says Nick van der Zweep, director of virtualization and utility computing at HP. In order to break up the one-server, one-application, one-staffer mentality, HP uses its own Shared Application Server Utility to let one IT administrator supervise many applications, each in its own virtual machine.

Virtualization also will change the way technology assets are deployed. Departments that believe they require their own server in the data center for an application may find their app doubled up with another department's. And instead of two IT administrators, one for each department, virtualization may facilitate IT staff cuts.

But for the market to fully embrace virtualization, the licensing question needs to be sorted out. Back at Denver Health and Hospital Authority, Veltri still pays Microsoft for 15 Windows Server licenses, even though the 15 SNA gateways are running on two servers. Along with Microsoft, Oracle and most other major application vendors take the same tack. "The software industry is way out of step with virtualization," Gartner analyst John Enck says. Software vendors should encourage customers to generate lots of virtual machines, lower the price of each software license, and make up the difference by selling more licenses. Microsoft's pricing, with its volume strategies, "will fall quickly" once virtual machines are disseminated on the next-generation x86 chips, Enck says. The company already has conceded that a dual-core processor will require only one Windows license or one SQL Server license, he says.

That may allow more server consolidation down the road. Media company Gannett Co. has several servers that are "approaching old age" and will become virtualized in the next 12 months, says IT architect Eric Kuzmack. Gannett uses VMware's Virtual Center and VMotion at its Silver Spring, Md., data center. "We've been using virtualization for two years ... to consolidate old and underutilized servers and in the development and testing environment," Kuzmack says. "We expect over time to see fewer and fewer workloads that require separate physical servers."

Virtualization is "pretty incredible technology," says Veltri, Denver Health and Hospital Authority's CIO. Photo by Greg Friedler

VMotion lets Kuzmack move a virtual machine from one server to another, which he says is "very helpful." With one virtual machine already running, a new one is started, then VMotion allows one server to fail over to the other. "There's no need for extra staff to come in at 3 a.m. and move applications to a new server," he says. "Anywhere we have virtual machines, we would use VMotion to move them."
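The general technique behind such moves is worth a sketch. The simulation below illustrates the pre-copy pattern commonly used for live migration; the names and numbers are invented, and it is not VMware's actual implementation or API.

```python
# An illustrative simulation of the pre-copy handoff behind live migration.
# The idea: bulk-copy memory while the VM keeps running, re-copy whatever it
# dirties, then pause briefly for the last pages and resume on the new host.

import random

class Host:
    def __init__(self, name: str):
        self.name = name
        self.vms = set()

def live_migrate(vm: str, source: Host, target: Host) -> None:
    dirty_pages = 10_000                 # memory pages still to move
    while dirty_pages > 100:             # pre-copy rounds, VM still running
        copied = dirty_pages
        # the running VM keeps dirtying a fraction of what was just copied
        dirty_pages = int(copied * random.uniform(0.05, 0.20))
        print(f"pre-copied {copied} pages; {dirty_pages} dirtied meanwhile")
    # brief stop-and-copy: pause, move the final pages, resume elsewhere
    source.vms.discard(vm)
    target.vms.add(vm)
    print(f"{vm} paused, final {dirty_pages} pages moved, "
          f"resumed on {target.name}")

live_migrate("sql-prod", Host("esx-01"), Host("esx-02"))
```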

Virtualization itself "doesn't change everything. Rather, it provides the flexibility to change everything," Illuminata's Eunice says.

A lot of the action in the virtualization market is taking place among software developers, who use virtual machines to test code before deploying it on multiple operating systems. All of that can be done on a single computer in separate virtual-machine environments.

Instead of buying 80 servers for software development and deployment, Barry Naber, assistant director of enterprise operations at International Truck and Engine Corp., used VMware's ESX Server to partition 10 servers into 80 virtual machines for application development and 70 for production. Developers divvy up the virtual machines between writing and testing code. Over three years, Naber says, this approach saved $1.1 million in hardware purchases. Server utilization has gone from less than 20% to 70% or, in some cases, 80%.

Virtual machines also can simplify disaster recovery. Instead of paying for a duplicate data center, virtualization allows application environments to be re-created on fewer off-site servers. That was Fidelity Information Services' approach. Using VMware, "virtualization has simplified the job of setting up the [disaster-recovery] environment we need," says Paul Little, configuration manager at the unit of Fidelity National Financial, a title- and mortgage-processing company.

The desktop benefits from virtualization, too. Software from Softricity Inc. called Softgrid can "sequence" common Windows applications, isolating the parts that are specific to a particular version of Windows. It can then detect which Windows operating system is running on a target PC and download the application with the right operating-system elements. If the PC is running Windows 2000, for example, then the download has the correct Windows 2000 Dynamic Link Libraries in it.
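In rough pseudocode terms, that matching step reduces to a lookup like the one below; the catalog, bundle names, and function are hypothetical, not Softricity's actual interface.

```python
# A toy illustration of the version-matching step described above -- not
# Softricity's actual mechanism. The idea: detect the target PC's Windows
# version, then ship the bundle sequenced with the matching DLLs.

# Hypothetical catalog: one pre-sequenced bundle per supported OS.
SEQUENCED_BUNDLES = {
    "windows-2000": "payroll_app_win2000.pkg",  # carries Windows 2000 DLLs
    "windows-xp": "payroll_app_winxp.pkg",      # carries Windows XP DLLs
}

def pick_bundle(os_key: str) -> str:
    """Return the bundle sequenced for this OS, or fail loudly."""
    try:
        return SEQUENCED_BUNDLES[os_key]
    except KeyError:
        raise RuntimeError(f"no sequenced bundle for {os_key}")

# A real client would detect the OS on the target PC; here we pass it in.
print(pick_bundle("windows-2000"))  # -> payroll_app_win2000.pkg
```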

That approach has the benefit of easing the strain of "location-specific" maintenance, says Shane Nicely, assistant VP of information services at Heartland Financial USA Inc. Nicely supports 110 applications for users in eight banks and 55 locations in Iowa, Illinois, Montana, and the upper Midwest. Before Softgrid, he or another IT staffer would travel to each location to deploy a new version of Windows and a new set of applications or call in a local IT support person. With Softgrid, applications that match the operating systems on existing machines can be downloaded and installed in a day instead of taking up a week of staff time at each location.

Saving time, money, and resources while preserving investments and enhancing operations are pretty powerful arguments for virtualization technology. And the business world appears to be ready for it.

Continue to the sidebar:
Chip Shot: AMD And Intel Add Virtualization Hooks
