VMware Datacenter Growth Far From Over

Some industry watchers say the virtualization wave is nearly done, but I disagree.

Charles Babcock

January 8, 2014


Some observers say the virtualization wave is nearly over because it's reached the 40%-50% of applications that are amenable to migration into virtual machines. I disagree. I believe virtualization still has far to go and will bring many changes in 2014.

Virtualization is getting so complicated and deeply entrenched in the datacenter that, in time, the virtualization administrator will find a counterpart in another new role: virtual machine analytics or intelligent virtual system management. Call it what you want, it's the art of using knowledge about running virtualized systems to redesign, reconfigure, and redeploy those systems in a way that utilizes resources more efficiently.

The idea of finding data buried in running systems to help IT do its job better is scarcely new. In 2012, it was evident in reported conversations with PayPal CTO James Barrese and Microsoft's Mike Neil, former manager of virtualization and now Azure cloud manager. It can also be found in a December blog post by acting VMware CTO Paul Strong and in an October InformationWeek commentary from VMware's Bruce Davie and Martin Casado.

[Free and open source virtual machines aren't the threat many think. See VMware Killed By Commoditization? Not So Fast.]

They suggest the next gains in virtualization will come from using big data systems to collect and analyze machine data. That data may be found in server-log file managers such as Loggly, Splunk, and Sumo Logic, or VMware's vCenter Log Insight. Data from such tools is being brought into products like VMware's vCenter Operations Manager, a sort of analytics middleware that collects telemetry from different points of the datacenter and derives operational intelligence.
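To make the idea concrete, here is a minimal sketch of the kind of machine-data rollup such an analytics layer performs. The log lines and function names are invented for illustration; a real pipeline would ingest data from a collector such as Splunk, Loggly, Sumo Logic, or vCenter Log Insight rather than a hard-coded list.

```python
import re
from collections import Counter

# Hypothetical sample of server-log lines; real data would come
# from a log collector, not a hard-coded list.
LOG_LINES = [
    "2014-01-08T10:00:01 host-a WARN disk latency 35ms",
    "2014-01-08T10:00:02 host-b INFO request served in 12ms",
    "2014-01-08T10:00:03 host-a ERROR vmotion failed",
    "2014-01-08T10:00:04 host-b ERROR datastore unreachable",
    "2014-01-08T10:00:05 host-a INFO request served in 9ms",
]

def severity_by_host(lines):
    """Tally log severities per host -- the kind of machine data an
    operations-analytics layer rolls up before acting on it."""
    counts = Counter()
    for line in lines:
        m = re.match(r"\S+\s+(\S+)\s+(WARN|INFO|ERROR)", line)
        if m:
            host, level = m.groups()
            counts[(host, level)] += 1
    return counts

counts = severity_by_host(LOG_LINES)
error_hosts = {host for (host, level) in counts if level == "ERROR"}
```

From a signal this simple, an operations layer can already answer the first-order question: which hosts are throwing errors right now?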

Where will we see improvement in 2014? VMware's acting CTO Strong wrote in his blog post that machine learning must come to the datacenter to help solve automation problems. "We have to use machine learning and big data to infer structure, and good and bad behavior," he wrote, acknowledging that the effort isn't limited to VMware alone.

"One of the things I would expect to see across the industry in general in 2014 is more use of these techniques, and tying these to provisioning engines, to enable more automated, policy driven closed feedback loops, for application service level management," he wrote. The provisioning engines are products like Microsoft Virtual Machine Manager, VMware's vCenter Orchestrator, and OpenStack's open-source Nova component.
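The closed feedback loop Strong describes can be sketched in a few lines: observe a service-level metric, compare it to the policy target, and hand a scaling decision to a provisioning engine. Everything here is hypothetical; a real loop would drive vCenter Orchestrator, Microsoft Virtual Machine Manager, or OpenStack Nova rather than this toy reconciler.

```python
# Minimal sketch of a policy-driven closed feedback loop for
# application service-level management. Thresholds and names are
# invented for illustration.

SLA_MAX_LATENCY_MS = 100   # policy: keep latency under 100 ms
SLA_MIN_LATENCY_MS = 30    # policy: below this, capacity is wasted

def reconcile(observed_latency_ms, current_vms):
    """Return the VM count the provisioning engine should converge to."""
    if observed_latency_ms > SLA_MAX_LATENCY_MS:
        return current_vms + 1          # scale out: SLA is being violated
    if observed_latency_ms < SLA_MIN_LATENCY_MS and current_vms > 1:
        return current_vms - 1          # scale in: reclaim idle capacity
    return current_vms                  # within policy: no action

# One tick of the loop: a latency breach yields a scale-out decision.
target = reconcile(observed_latency_ms=140, current_vms=4)
```

The "machine learning" Strong has in mind would replace the two fixed thresholds with inferred models of good and bad behavior, but the loop shape stays the same: measure, compare to policy, provision.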

In addition to server-log files, the learning system must be able to draw on network traffic statistics and data from other devices, including firewalls. Can machine learning help protect against intrusions? Bill Roth, group product manager for VMware's Log Insight, said in an email message that VMware is working on "content packs" or plug-in additions to Log Insight that can collect and understand data from routers, firewalls, intrusion detection systems, and vulnerability scanners. Data from these and other devices will fit into the Log Insight framework for data handling. Work is underway to make such data useful through analytics, with much left to do in 2014.

Another area of change will occur in the rapidly evolving realm of virtualized networking. Virtual networking started with a bang in 2013 when VMware launched its NSX Platform at VMworld. Cisco Systems responded by talking up its Application Centric Infrastructure (ACI) as an alternative. Big Switch Networks, Nuage Networks, Cumulus Networks, and others have proposed alternatives of their own.


Martin Casado, VMware network architect and contributor to the NSX Platform, said in an interview Monday that virtualized networking will move beyond proof-of-concept and early-stage deployments into production in 2014. It addresses networking complexity, but far from adding confusion, it will offer greater "trending and troubleshooting visibility" into the network, he predicted.

Individual networks will soon be defined by goals the network administrator sets to govern the network-building part of the SDN. In VMware's NSX Platform, that role falls to Service Composer, which takes the declarative rules, policies, and goals set by the administrator and uses them to construct a network service.

Such a service will follow the principle of "least privileged state," with just enough ports, devices, and access assigned to it to do its job. That reduces the attack surface to outsiders, Casado noted. Policies will be created and automatically enforced that allow users to have access to certain resources and groups, but not to others.
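The least-privileged-state idea can be sketched as a default-deny policy check: a flow is permitted only if the source group is explicitly granted reach to the destination and the port is in the destination's declared minimal set. These data structures are invented for illustration; they are not VMware Service Composer's actual model.

```python
# Sketch of "least privileged state": everything not explicitly
# declared is denied, which shrinks the attack surface. Group and
# port values here are hypothetical.

POLICY = {
    "web-tier": {"ports": {80, 443}, "may_reach": {"app-tier"}},
    "app-tier": {"ports": {8080},    "may_reach": {"db-tier"}},
    "db-tier":  {"ports": {5432},    "may_reach": set()},
}

def allowed(src_group, dst_group, port):
    """Permit a flow only if policy grants src reach to dst AND the
    port is in dst's declared minimal set."""
    src = POLICY.get(src_group)
    dst = POLICY.get(dst_group)
    if src is None or dst is None:
        return False                  # undeclared groups get nothing
    return dst_group in src["may_reach"] and port in dst["ports"]
```

Under this scheme the web tier can reach the application tier on its one declared port, but it cannot touch the database directly, and an undeclared outside group can reach nothing at all.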

In addition, VMware is learning from early implementers, he continued. "It's being used in ways we never thought of. Customers are doing what-if modeling with it by taking a snapshot of an environment, moving it to a development environment, and then seeing what they can make work," he said. Instead of needing to build the physical network to see if it works, they can first test it in an offline environment, perfect it, and then push it into production.

Somewhat in the manner of vCenter Operations Manager, the software-defined network needs the help of analytics and machine learning, Casado said. In addition to learning from previous network experience, the NSX Platform will reach out to other parts of the infrastructure, such as network flow analysis monitors and firewalls, to find out what those devices know.

With such information in hand, NSX will become "an erector set of virtual components, allowing the system to build networks that have only the capacity you need," Casado said.

The SDN will be another data-driven system, feeding results into vCenter Operations Manager. Operations Manager will use its intelligence to try to impose best-case configuration, capacity management, and performance management. In the end, VMware echoes Cisco's theme of pushing the network to become more "application centric."

Its product suite will aim to enable customers "to accelerate the delivery and consumption of the applications that make their businesses real, that differentiate their businesses, while hiding the complexity of the underlying infrastructure," acting CTO Strong said in his December blog post. As applications align with the business, the underlying infrastructure swings into place to drive virtualization deeper into the datacenter -- last year 40%, this year 50%, as the ball keeps moving down the field toward the 100% goal line.

Charles Babcock is an editor-at-large for InformationWeek, having joined the publication in 2003. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week.

Cloud Connect Summit, March 31–April 1, 2014, offers a two-day program colocated at Interop Las Vegas developed around "10 critical cloud decisions." Cloud Connect Summit zeros in on the most pressing cloud technology, policy, and organizational decisions and debates for the cloud-enabled enterprise. Cloud Connect Summit is geared towards a cross-section of disciplines with a stake in the cloud-enabled enterprise. Register for Cloud Connect Summit today.
