Making The Move To Multicore

Your free ride on the Moore's Law bandwagon is coming to an end. Now how are you going to speed up your apps?

October 17, 2008


Quad-core processors are rapidly becoming standard equipment, not just for servers but for mobile and desktop systems too. Following the launch of its first mobile quad-core processors based on 45nm technology earlier this year, Intel began shipping second-generation quad-core server processors that deliver 25% higher performance than the company's first-gen quad-core parts. And Intel, IBM, AMD, Sun, and others are just getting started: All these companies are testing or have on their drawing boards massively multicore processors with as many as hundreds of cores per chip. These will bring the capabilities of supercomputers to everyday devices, to a level bordering on Buck Rogers science fiction: think real-time data mining across terabytes of data, full-body medical scans, artificial intelligence for smarter cars and appliances, and virtual reality for a wide range of modeling, visualization, and physics simulations.

Getting from here to there ain't gonna be easy, though. It will definitely take more than shelling out big bucks for new hardware to get Buck Rogers-level results. First off, there's a good chance IT departments standardizing on new quad-core technology will find that a big chunk of that expensive hardware is sitting idle because their software wasn't built to take advantage of multiple cores.

Don't want to be left out of the race to super-high performance? Then get your apps ready now.

Walking On The Sun

For years, IT shops have ridden a heady hardware performance curve as chip designers across architectures, from Intel's and AMD's x86 to Sparc and PowerPC, delivered a series of souped-up designs that doubled in density and performance every 18 to 24 months. From the 1980s until a few years ago, transistor size consistently shrank, leading to higher densities and faster clock speeds (from 5 MHz in 1983 to 3 GHz in 2002) and performance gains that required no software changes.
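That cadence can be checked with back-of-the-envelope arithmetic, treating clock speed as a rough stand-in for overall performance: the climb from 5 MHz to 3 GHz over 19 years works out to about one doubling every two years.

```python
import math

# Clock-speed figures cited above: 5 MHz in 1983, 3 GHz in 2002.
start_mhz, end_mhz = 5.0, 3000.0
years = 2002 - 1983

speedup = end_mhz / start_mhz                # 600x overall
doublings = math.log2(speedup)               # ~9.2 doublings
months_per_doubling = years * 12 / doublings

print(f"{speedup:.0f}x in {years} years: one doubling every "
      f"{months_per_doubling:.1f} months")
```

The result, one doubling roughly every 24.7 months, lands right at the edge of the 18-to-24-month range.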

But Intel CTO Pat Gelsinger's 2001 statement that continuing down the conventional design path would result in a chip whose surface would be hotter than the sun marked a sea change. Manufacturers gradually stopped pushing higher clock speeds, and since the Pentium 4 era, chip manufacturers have focused on improving density and turned en masse to hyperthreading and multicore.

The twist, however, is that these new architectures don't automatically translate into higher application performance the way clock-speed bumps did. IT departments must adopt a clear and consistent strategy to take advantage of multicore performance, and that means upgrading applications so they can exploit the hardware. How do you know whether your company's mission-critical apps can take advantage of multiple cores? "Test [the app's] performance on a new quad-core with the CPU meter active. If multiple cores aren't lighting up, your app probably can benefit from some rework," says James Staten of Forrester Research. "If your application is already built using the loose coupling and modularity of SOA, it's probably ready for the first wave of this change. But for most enterprise applications, some rework will likely be necessary to realize the full performance potential of multicore."
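Staten's CPU-meter test can also be approximated in code. The sketch below uses a hypothetical CPU-bound workload (not any particular enterprise app) and times the same work run sequentially and then fanned out across cores with Python's multiprocessing module; if the parallel run isn't meaningfully faster on a multicore box, the workload isn't exploiting the extra cores.

```python
import multiprocessing as mp
import os
import time

def burn(n):
    """A stand-in for CPU-bound application work: sum of squares."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    n, tasks = 500_000, 8

    t0 = time.perf_counter()
    serial = [burn(n) for _ in range(tasks)]
    t_serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    with mp.Pool() as pool:            # one worker per core by default
        parallel = pool.map(burn, [n] * tasks)
    t_parallel = time.perf_counter() - t0

    assert serial == parallel          # same answers, different wall time
    print(f"{os.cpu_count()} cores: serial {t_serial:.2f}s, "
          f"parallel {t_parallel:.2f}s")
```

On a quad-core machine the parallel pass should finish in a fraction of the serial time; a single-threaded legacy app, by contrast, leaves that gap on the table.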

If your company's legacy in-house-developed applications are neither modular nor multithreaded, you have a few options to outfit them to take advantage of the horsepower of multicore platforms. Rewriting these apps to incorporate parallel and/or concurrent designs is a daunting task that will require a significant investment in time and training, and it's unlikely most IT shops have enough in-house expertise to do parallel distributed computing well enough to reliably generate ROI. If you choose to go this route, several large software vendors offer tools for parallel computing; these range from Intel's Threading Building Blocks to Microsoft's TPL, part of the Parallel Extensions to the .NET framework, to Sun's C, C++, and Fortran compilers, which automatically segment single-threaded code into multithreaded equivalents.
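The flavor of that rework, which tools like Threading Building Blocks and the TPL automate for C++ and .NET code, is to break one sequential loop into independent chunks that can run on separate cores. Here is a minimal Python sketch of the same pattern; the chunking scheme and worker count are illustrative choices, not anything prescribed by those libraries.

```python
from concurrent.futures import ProcessPoolExecutor

def make_chunks(data, workers):
    """Split one sequential workload into independent slices, one per
    worker -- roughly the restructuring that TBB's parallel_reduce or
    TPL's Parallel.For perform under the hood."""
    step = max(1, len(data) // workers)
    return [data[i:i + step] for i in range(0, len(data), step)]

def parallel_sum(data, workers=4):
    chunks = make_chunks(data, workers)
    with ProcessPoolExecutor(max_workers=workers) as ex:
        return sum(ex.map(sum, chunks))   # reduce the partial sums

if __name__ == "__main__":
    nums = list(range(1_000_000))
    assert parallel_sum(nums) == sum(nums)
    print("parallel and sequential sums agree")
```

The hard part in a real legacy app is not the mechanics shown here but proving that the loop iterations really are independent, which is exactly where in-house expertise tends to run thin.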

Another option is to adapt your single-threaded apps using platform software from third-party vendors like Rogue Wave Software or RapidMind, which offers a development and runtime platform that "parallelizes" single-threaded applications across multiple processors/cores. RapidMind also offers sets of customizable extensions targeting vertical markets that can be used to quickly build applications for multicore processors and accelerators.

Accelerated Computing Solutions uses RapidMind Financial Extensions to provide high-performance computing, grid, and messaging apps to large Wall Street banks and hedge funds. "Given recent events on Wall Street, there is now an even greater need for stable, high-performance systems that will accurately and quickly calculate risk and prices and immediately detect and respond to opportunities in a high volume, volatile, and unpredictable market," says ACS CEO Larry Cohen. "Portfolio and risk managers are always looking to gain a first-mover advantage with faster technology, which is important in both volatile and BAU-business as usual-times, with innovative approaches to run complex models and algorithms in ways that fully exploit the capabilities of the latest multicore processors and hardware accelerators."

The Air Force Space Command employed Rogue Wave Software's Hydra platform to migrate away from a large mainframe environment to an Intel-based hardware cluster running multicore processors. Rather than starting from scratch with new parallel computing languages, the combined commercial and government team, which included contractor Lockheed Martin and federally funded R&D center MITRE, was able to migrate an existing space algorithmic application while using new tools to abstract the complexity of multithreading. The result was significant cost savings. "The entire solution, including hardware and software, was a tenth the cost of running it on the old mainframe system for just one year, and the old system struggled to keep up with the load," says a senior U.S. government team member.

Wicked Game
One industry that has enthusiastically embraced multicore chips? Computer-game makers. State-of-the-art games like World of Warcraft exploit dual- and quad-core processors in Windows PCs, while console titles tap the STI Cell processor (a PowerPC core plus eight synergistic processing elements) in the Sony PlayStation 3 and the three-core IBM Xenon processor in the Xbox 360. Likewise, major enterprise applications such as Oracle, WebLogic, DB2, and Apache are increasingly being rearchitected to be multithreaded, enabling them to take full advantage of the large symmetric multiprocessing servers that dominate their market. High-performance media authoring apps, like Adobe Creative Suite, Avid Media Composer, and Autodesk's AutoCAD, as well as other technical, financial-modeling, and image-manipulation software, have likewise been tuned to leverage specialized hardware.

On the desktop side, vendors like Microsoft and Apple are just now making parallel processing on multicore platforms a priority. In March, Craig Mundie, Microsoft's chief research and strategy officer, told Reuters that Redmond is preparing for a parallel computing shift that he expects will be as big as the rise of the personal computer or the Internet. Saying the current use of multicore chips is but the "tip of the iceberg," Mundie promised that Windows 7, the follow-on to Windows Vista, would be capable of exploiting both multicore (eight or fewer cores) and "manycore" (more than eight cores) processing.

Apple is also jumping feet first into writing software for multicore processors. During the launch of the iPhone earlier this year, Steve Jobs told the New York Times that the next generation of the Apple OS will focus not on new features, but will instead solve the problem of writing software for multicore processors with a technology Apple has code named "Grand Central."

"The processor industry is going to add more and more cores, but nobody knows how to program those things," Jobs said. "I mean, two, yeah; four, not really; eight, forget it."

Tips On Moving To Multicore
Going multicore is one way to boost potential horsepower for your business apps, but doing so requires matching cores to OSes to workloads across your application portfolio, which is easier said than done.

"Not all processors are created equal," says Doug Sandy, a senior staff technologist with Emerson Network Power. "Processors are designed with a purpose in mind: DSPs are designed to perform signal and image processing tasks, packet processors are best suited for manipulating network headers and routing traffic, and so forth. The bottom line is that processors will perform best on the applications for which they were designed, and might perform quite poorly in other uses."

Sandy's concerns are echoed by Richard Kaufmann, distinguished technologist at Hewlett-Packard. "If your application isn't amenable to parallelization, or your users cannot benefit from running multiple copies in parallel for throughput, you won't receive any benefits from multicore processors," Kaufmann says. "In fact, you might find your application runs slower over time."

The first, best use for multicore machines: virtualization. Right now, many conventional data centers use only 5% to 10% of their server capacity. Through virtualization, utilization can climb to 85% or even 90%, which can allow most IT groups to consolidate a boatload of underused servers. Intel recently conducted a study to quantify the potential benefits of replacing 126 single-core Xeon servers in a typical data center with 17 quad-core servers, and found it could deliver the same performance capacity with an 83% reduction in floor space; an 87% reduction in energy cost (approximately $53,000 in savings, depending on utility rates); and full payback on the new servers in less than two years.
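The study's arithmetic is easy to reproduce. The per-server purchase price below is a hypothetical assumption (the article doesn't give Intel's figure), used only to show how a sub-two-year payback can fall out of the numbers.

```python
# Figures from the Intel study: 126 single-core servers replaced by
# 17 quad-core servers, saving roughly $53,000 a year in energy.
old_servers, new_servers = 126, 17
annual_energy_savings = 53_000      # dollars per year, from the study

# Hypothetical purchase price per quad-core server (assumption).
assumed_server_cost = 6_000         # dollars

consolidation_ratio = old_servers / new_servers
outlay = new_servers * assumed_server_cost
payback_years = outlay / annual_energy_savings

print(f"{consolidation_ratio:.1f}:1 consolidation, ${outlay:,} outlay, "
      f"payback in {payback_years:.1f} years")
```

At that assumed price, a roughly 7.4:1 consolidation pays for itself out of energy savings alone in about 1.9 years, before counting floor space or administration costs.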

There are many other benefits to using virtualization to do server consolidation, including eliminating server sprawl, making more efficient use of server resources, improving server availability, and enhancing disaster recovery, but it's important to realize that virtualization is not a panacea, and there are a number of cases where it won't be appropriate. For example, heavily utilized commercial apps or high-transaction database systems that crunch a lot of numbers aren't good candidates for virtualization since they may not deliver acceptable performance when the virtualization overhead is added. "There are tons of advantages for virtualization, such as capacity planning, dealing with peak loads, migration, and so forth," HP's Kaufmann says. "But you do have to make sure that you're not sacrificing performance for this flexibility."

Another option to consider for reducing server sprawl is application virtualization using tools like VMware's ThinApp 4 or Microsoft's App-V. Application servers typically proliferate within data centers because each time an IT department launches or updates an application, individual project managers will insist on bringing in their own servers, not just for operations, but also for development and/or testing. This is especially the case for mission-critical applications where a project or department head may not want to share development space with others in the organization. By hosting individual VMs on one server with individual memory and network interface cards, you can begin to reduce the number of devices required to run home-grown applications, especially legacy in-house apps that are often built with older coding practices.

You can further reduce the overall number of servers by creating virtual instances for all test, development, and even production servers, which will come in handy when you update older in-house apps with more modular and loosely coupled designs better able to harness the horsepower of newer multicore processors.

Forrester's Staten notes the mishmash of multicore licensing schemes among software vendors like IBM, Microsoft, and Oracle, but says Microsoft's per-socket pricing model may eventually win out because it's easy to calculate, and the number of sockets in a system won't change as dramatically as the number of cores. "Virtualization adds an extra wrinkle," he says, since IT can use virtualization to create multiple virtual servers on a single core. The most cost-effective route is often to negotiate enterprisewide licenses to avoid getting bogged down in complicated per-core or per-server calculations.

The good news for IT shops trying to evolve their single-core application portfolios to the multicore world, and in the process continue their ride on the Moore's Law Express, is that server virtualization and consolidation projects offer considerable cost savings that should make the business case an easier sell, even in the current volatile economy.
