Utility Interest: The Model's Catching On

April 18, 2005


Brian Davenport, senior VP and CIO of Stewart Mortgage Information, a provider of financial services to banks and mortgage companies, last year decided to outsource most of the company's data-center operations rather than undertake a major upgrade of its infrastructure.

It turned to VeriCenter Inc., which aims to become a true utility-computing vendor by building a network of regional data-center capacity, currently structured around flexible monthly contracts but eventually migrating to a usage-based model. "We wanted to find a way we could provide a high level of service without incurring the extremely high infrastructure costs," Davenport says. Stewart Mortgage went live with VeriCenter in October and has since seen a 15% to 20% reduction in IT expenses, a figure Davenport expects to improve further.

Twenty-one percent of the respondents to a survey by InterUnity Group and AFCOM say they plan to implement utility computing next year, and 10.6% of those already using the model expect to increase their use. Vendors with utility-style offerings, such as VeriCenter, Hewlett-Packard, IBM, and Sun Microsystems, are seeing the results of that growing interest.

Sun in the past year has introduced programs that provide access to its grid-computing network at a rate of $1 per CPU per hour and $1 per gigabyte for storage. Jonathan Schwartz, president and chief operating officer, says Sun is working with more than 10 companies with computation-intensive workloads on proof-of-concept programs that will lead to multithousand-CPU, multiyear contracts. "But to me, what will be more interesting is the long tail, the marketplace for demands of very small increment CPU loads, which eventually will be a bigger market than the large-scale implementations," Schwartz says.
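To make the usage-based model concrete, here is a back-of-the-envelope sketch in Python of how a bill would accrue under Sun's published rates. The rates come from the article; the workload figures (CPU count, run time, storage size) are hypothetical numbers chosen purely for illustration, and the billing formula is an assumption about how such metering might work, not Sun's actual billing logic.

```python
# Illustrative arithmetic only: estimates a charge under Sun's published
# utility rates ($1 per CPU per hour, $1 per GB of storage).
# All workload figures below are hypothetical, not from the article.

CPU_HOUR_RATE = 1.00    # dollars per CPU per hour (Sun's published rate)
STORAGE_GB_RATE = 1.00  # dollars per gigabyte of storage (Sun's published rate)

def utility_charge(cpus: int, hours: float, storage_gb: float) -> float:
    """Usage-based charge: pay only for CPU-hours consumed plus storage held."""
    return cpus * hours * CPU_HOUR_RATE + storage_gb * STORAGE_GB_RATE

# Hypothetical computation-intensive job: 500 CPUs for 72 hours, 200 GB of data.
print(f"${utility_charge(cpus=500, hours=72, storage_gb=200):,.2f}")  # $36,200.00
```

The point of the model is visible in the formula: the customer pays for 36,000 CPU-hours actually consumed rather than owning and powering 500 machines year-round, which is also why Schwartz's "long tail" of very small CPU loads becomes economically interesting.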

The price point for getting into utility computing today is around 50 cents per CPU per hour, says David Gelardi, VP of deep-computing capacity on demand at IBM, but each engagement must be negotiated around specific computing and software requirements. "You really can't look at capacity on demand in the same way as a utility like water or electricity because it's more sophisticated than that," he says. "We're not there as an industry yet, and I know most clients aren't there yet."

Utility computing will be a constantly evolving technology over the next decade, says Steve Prentice, an analyst with research firm Gartner. "What we are going to increasingly see is an infrastructure that's a mix of corporate-owned data and externally purchased services that are blended together in, hopefully, an almost seamless patchwork at the point of delivery."


Return to the story: Step Into The Future
Continue to the sidebar: CPU Cool: Getting Faster But Not More Power-Hungry
