Inside Linux
The Penguin is taking the IT world by storm. The open-source OS offers enterprises an alternative to high-priced Unix and Windows systems, and was cited by 75 percent of our poll respondents as a strong choice for the data center.
January 30, 2004
But those accustomed to living in a Microsoft world filled with GUIs and eye candy can be wary of jumping into the domain of true multiuser systems and CLIs (command-line interfaces). To be fair, the same can be said of some Sun Solaris, Hewlett-Packard HP-UX and IBM AIX administrators, who find many of Linux's quirks annoying.
The conundrum is that resistance to change can inhibit your company's ability to migrate to less expensive or more efficient systems of any kind--not just Linux--unless you're willing to replace balky personnel, losing technical expertise and business acumen in the process.
Further, hiring can be problematic: Unix skills are pricey across the board--in 2003, the average base salary for a Unix systems administrator was $96,163, compared with $67,355 for a Windows NT administrator, according to Meta Group.
To ensure you're getting what you're paying for, look for a Linux certification--RHCE (Red Hat Certified Engineer) and LPI (Linux Professional Institute) are examples. But admins holding such certs are scarce, and those you can find are pricey.

One of Windows' attractions is that while many administrative tasks can be performed via the command line, GUIs often make these chores less onerous. In response, Linux has evolved an extensive set of management GUIs, both Web- and X Window-based, that provide the same capabilities. It's possible to administer some Linux distributions without ever touching a command line or digging through a manual to find the location of a configuration file.
But beware: You'll hit a wall. Some applications simply cannot be configured or managed without using the command line. And in many cases, an ISV (independent software vendor)-developed application will preclude user-interface-assisted installation, even if its Windows counterpart offers such a nicety. Expertise in vi, a Unix text editor, is no longer required, but a working knowledge of the command-line environment is still de rigueur for those serious about Linux.
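For a taste of what that command-line work looks like, here's a minimal sketch of a routine chore on a hypothetical Apache Web server; the file path is typical of Red Hat-style distributions and may differ on yours.

   vi /etc/httpd/conf/httpd.conf    # hand-edit the MaxClients directive
   apachectl configtest             # sanity-check the syntax before going live
   apachectl graceful               # reload the config without dropping connections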
Technical-support concerns are also a sticking point. It will ease your troubled minds to know that Hewlett-Packard, IBM and Red Hat, among others, offer 24x7 support for Linux, with Hewlett-Packard and IBM supporting Red Hat and SuSE, the leading contenders for enterprise use. If the need to have help just a phone call away is holding you back, take a gander at our Linux support services Buyer's Guide and our review of distribution-neutral services, "TEAMwork Pays Off for Linux".

As for the perceived lack of data-center applications that run on Linux, this is largely a result of ineffective marketing by software vendors. In "Where to Find a Fit," we'll give these vendors a hand by calling out both commercial and open-source data-center offerings.
By the Numbers (chart)
Back when Windows NT was gearing up in the mid-'90s, analysts predicted it would take hold in the enterprise based on the availability of some 1,200 commercial applications for the fledgling server platform. Compare that with the more than 4,500 commercial applications available for Linux today, according to a recent IBM survey. Not even non-Euclidean math can equate 4,500 to a lack of application availability.
Those who worry about SCO's efforts to charge licensing fees for Linux can find comfort in indemnification programs and defense funds established by Hewlett-Packard, Intel, Novell and the Open Source Development Labs, among others. We think companies shouldn't let SCO's claims deter them (see "Don't Be Chilled by Linux Lawsuits"). In fact, if we used SCO software, we'd worry that the $9 million Gartner estimates the company will spend pursuing lawsuits against Linux users and vendors--with uncertain returns--could well weaken SCO and jeopardize the future of its own products.
As we went to press, SCO had responded to the Utah U.S. District Court's Dec. 5 order requiring the company to clarify its allegations against IBM, but details of the information contained in the SCO response had not been disclosed (see more on this subject in "Industry Insights").

A surprisingly high number of respondents to our survey--36 percent--cited stability/maturity as a reason for not deploying Linux. That gave us a chuckle, because it's likely that Linux is already running in many of their infrastructures, albeit undercover: Most appliances that perform DNS, firewall, proxy and content-switching tasks run on Linux.

This misconception often comes from folks who experimented with Linux several years ago and found its driver support hit-and-miss. Sure, it's rare to find a Linux driver sitting alongside the Windows drivers on the software CD sold with the device.
Chances are you'll need to visit the vendor's Web site. But as Linux has grown in popularity and usefulness as a general-purpose OS, driver support has also increased. Hot-plug USB, SCSI and RAID adapters, video cards and other peripherals and hardware are now supported, not only by open-source developers, but often by the hardware vendors.
Moreover, if you deploy a distribution from a vendor that has partnerships with a Tier 1 hardware maker, such as Dell, Hewlett-Packard or IBM, the hardware-driver issue all but disappears. Try deploying Slackware and you may run into problems, but even in these more general-purpose distributions, the question of driver availability for most hardware has become moot.
More troubling than driver support was Linux's inability to correctly probe hardware and determine which driver to use in the first place. This plug-and-play functionality, which Windows has provided for many years, took a bit longer to mature on the Linux platform. Although its plug-and-play capabilities are by no means perfected, we've reached the point where Linux will identify and install the correct drivers for hardware from all Tier 1 vendors--and, quite often, from more obscure ones.
Well-Developed
Worldwide Linux Server Shipments (chart)
The argument most often invoked by those who buy into the myth of Linux instability concerns its development model. The most oft-cited complaint about open-source software--and, by extension, Linux--is that no single entity is accountable for it: the old "you can't get defects resolved or obtain support" saw.
Many of the most popular open-source projects are released under the GPL (General Public License), the LGPL (Lesser General Public License) or similar terms at no charge, with commercial licenses or paid support contracts available for corporate use that provide technical support and defect resolution. This puts the most popular and commonly used open-source applications--Apache, AXIS, JBoss, Jetty, MySQL, Saxon and Tomcat--on the same playing field as commercial applications: Someone is responsible for the software, and you can get support.
Of course, consider the source. If you download "Tom's Cool Linux GPL Application" from a personal Web site, that's no different from downloading "Mary's Shareware Windows Application" and then attempting to get support for it. We all like free stuff, but as with any application--commercial or open-source--it's your job to make sure you can get the service and support your organization needs.
As with hardware, it's likely that open-source software lives in commercial products deployed within your organization. When a product needs a Web interface for management or user interaction, for example, it often ships with Apache/Tomcat, JBoss or Jetty--all open-source projects that prove not only that open-source development is viable, but that it works well enough for companies like Hewlett-Packard, Intel and IBM to back the OSDL, where open-source innovations are encouraged and released into your waiting hands.

A number of enterprises have found success. FedEx Corp. and Google run their entire infrastructures on Linux. Amazon.com, Computer Associates, Disney, DreamWorks, L.L. Bean, Pixar, Merrill Lynch, Morgan Stanley and Smith Barney also rely on Linux for day-to-day operations, while Shaw's Supermarkets and Supervalu are 100 percent Linux today. And let's not forget that the National Oceanic & Atmospheric Administration (NOAA) tracked Hurricane Isabel on Linux-based systems. Life-and-death decisions are made based on information from applications running on Linux.
If you need your own empirical evidence of Linux's stability, we suggest deploying it where redundancy is required--firewalls, DNS, Web servers and file/print sharing.

For temporal reference, Linux is now more than 10 years old, only one year younger than Microsoft's first accepted desktop operating system and two years older than Microsoft's server OS. One can argue that Windows was based on DOS, thereby making it more mature, but we'd counter that the operating principles of Linux are based on decades-old Unix.
Feel Safe
Although security was cited by just 21 percent of our survey's respondents as a hindrance to deployment, it's a valid concern. There are, indeed, vulnerabilities, but they've typically had far less of an impact than those affecting Windows machines. That's because the security models of the two operating systems are vastly different: In Linux, it's unlikely that anything a user can install will infect the core operating system. Many of the viruses and exploits successful against Windows are unlikely to have a similar impact on a Linux system, where mail clients and browsers aren't closely integrated with the underlying operating system.
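A quick illustration of that separation, from a hypothetical session as an ordinary (nonroot) user:

   $ touch /usr/bin/trojan
   touch: cannot touch `/usr/bin/trojan': Permission denied

Unless an attacker gains root privileges, system binaries and the kernel stay out of reach, so a compromised mail client or browser generally puts only that user's files at risk.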
Yet many early attacks against Web servers were perpetrated against sites served by Apache via CGI scripts, often written in Perl. Lax permissions and a failure to properly validate user input could give an attacker remote access or leave behind Trojans for easier entry later.
These types of attacks have, for the most part, been stamped out over the years, but Linux is not invulnerable to attacks, viruses or malicious code. It simply requires that the attacker be ingenious and have a thorough knowledge of the system he or she is attacking--which can translate into more devastating results if an attack is successful. Bottom line: No operating system or software is invulnerable when it has an active network connection.

Organizations that place a premium on security often seek out a hardened Linux distro; see "Hardened Linux Puts Hackers EnGarde".

In our poll, just 11 percent said cost was not a factor in deciding to deploy Linux--but only 17 percent said it was the biggest consideration. Most consider price just one facet. In our experience, the most common business argument bandied about for Linux is based on total cost of ownership. The reality, however, is that TCO is not the primary decision-making factor: In almost all large-scale deployments, the overriding consideration has been the more straightforward COA (cost of acquisition), and this trend will likely continue.
Still, that doesn't mean TCO can be ignored. Factors to consider include not only the cost of hardware and software over time, but also the cost of administration. A recent Robert Frances Group study, "Total Cost of Ownership for Linux Web Servers in the Enterprise" (see www.rfgonline.com), compared the TCO of Linux, Solaris and Windows and found that a Linux admin, on average, supports four machines for every one managed by his or her Windows counterpart. Patching and updates for IIS versus Apache, for example, account for some of this disparity. But even assuming a somewhat higher salary for Linux expertise, administrative savings add up: At the salaries cited earlier, a four-to-one ratio puts per-server administrative labor on Linux at roughly a third of the Windows figure.
The cost of downtime is also a consideration. The RFG report estimates it as high as $1 million per minute--we wouldn't want to accidentally kick out a plug at one of these companies!--but to gauge your own exposure, you must know which applications reside on the servers in question. Note also that much of the downtime overhead comes not from catastrophic crashes but from routine patching: Almost all of the myriad patches, not to mention software installs, for Windows-based systems require reboots, and every reboot carries an associated dollar cost. Most patches for Linux, in contrast, don't need a reboot; they require only that the affected process be shut down, patched and restarted, leaving other applications to continue on their merry way without interruption.
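As a minimal sketch--assuming a Red Hat-style system and a hypothetical vendor-supplied patch RPM--the whole exercise amounts to a couple of commands:

   rpm -Fvh openssh-server-*.rpm   # freshen the package only if an older one is installed
   service sshd restart            # bounce just the patched daemon
   uptime                          # the box itself never went down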
Of course, there are cases when a Linux system must be rebooted, or a required patch/upgrade results in application downtime, but this is far less common on Linux than on Windows systems because of the deep integration of almost all Microsoft components throughout its operating system.
TCO calculations also must take into account the cost of software licenses over time. Red Hat's distro was, until the end of 2003, free (as in gratis), but that is no longer true. In fact, pricing of the Red Hat Enterprise Linux line is more closely aligned with Microsoft's model--a subscription-based, per-server, per-year fee. The big difference is that Red Hat doesn't require CALs (client-access licenses), whereas Microsoft makes up for its lower base price by charging for CALs and forcing upgrades to newer versions of its operating system.
IT Minute: Linux in the Enterprise. Grab your RealPlayer and get the inside scoop on which applications you'll need to put Linux to work in your organization.
Although no one we've talked to is pleased with the pricing structure for Red Hat Enterprise Linux (RHEL), Red Hat says it will provide increased ISV support for Linux as well as additional technical-support options, and these niceties do not come free (see our review of RHEL). An alternative, of course, is to move to SuSE, which Novell now supports and sells; taking this tack reduces the cost of acquisition to far below that of Solaris or Windows.

There's also a large disparity among the hardware requirements for Windows, Linux and other Unix-based operating systems. Many flavors of Unix require beefy, proprietary (read: expensive) boxes, and the minimum hardware cost of entry for Windows 2003 is also substantial. A low-end Intel machine, on the other hand, can run Linux quite happily. In our own NWC Inc. and Green Bay, Wis., Real-World Labs, we have a number of Dell Optiplexes that are too underpowered to run Windows 2003 but serve just fine as Linux servers. It is this repurposing of hardware--the leveraging of existing investments--that makes Linux appealing for redundant, large-scale data-center deployments.
A reason for this resource-needs gap between Linux and Windows is that while you can use the GUI to configure Linux, the GUI does not need to be running after configuration on a Linux installation--meaning that a whole lot of CPU cycles are freed up to complete the tasks a server should be performing, rather than drawing pretty pixels and interrupting tasks to check for GUI events. The GUI can be separated from the operating system, meaning cycles are used only during configuration when necessary, not 24 hours a day. Yes, X Window is a pig, but only when it's running.
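On most distributions, turning it off is a one-liner. A minimal sketch, assuming a Red Hat- or SuSE-style init setup:

   runlevel    # "N 5" means X is running alongside everything else
   init 3      # drop to text-mode multiuser; X and its cycles go away
   startx      # bring the GUI back only when you need to configure something

To make the change stick across reboots, set the default runlevel to 3 in /etc/inittab.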
So Where Does It Fit?
Linux is most often deployed on the edge of the network, for TCP/IP-based applications such as Web and mail servers, DNS, FTP servers, and proxies/caches. Generally, Linux lives where redundancy is required--in most Web farms, you'll find racks of Intel-based Linux servers running Apache and serving millions of Web surfing clients a day. Indeed, Apache on Linux dominates Web-hosting providers and large-scale search engines, such as Google, proving that this combination is more than capable of providing Web- and Internet-based services for even the most demanding high-availability environments.
The Chosen One: Although it still lags on the desktop, Linux got a boost recently when the Israeli government joined the growing list of organizations that have stopped buying Microsoft Office and started working with IBM and Sun Microsystems to improve OpenOffice.
Mail-server deployment may sound at first like an anomaly. After all, most organizations require the functionality of a full-featured groupware server, and the choices for such applications on Linux are severely limited, with offerings from only a few vendors, such as Novell (SuSE) and IBM (Domino).
Where Linux finds its niche in the mail-server market is serving as an internal routing department, sorting mail for various subdomains and directing messages to the appropriate groupware server within the organization, and as a proxy of sorts--a frontline filter for more sensitive, less flexible systems, such as Exchange. This works because many Linux systems can be easily extended through configurable rules and filters that take advantage of pattern matching and regular expressions. This isn't true of systems like Exchange, which can be extended only through development of code or installation of third-party products. Sendmail, the best-known Linux-based messaging product, has been deployed for both its filtering capabilities and its scalability, serving such large organizations as Pfizer and Harvard University.
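As a minimal sketch of that routing role--assuming sendmail with the mailertable feature enabled and a hypothetical internal Exchange host--the relevant configuration is just a few lines:

   # hand mail for a subdomain to the internal Exchange server
   echo 'sales.example.com    smtp:[exchange.corp.example.com]' >> /etc/mail/mailertable
   makemap hash /etc/mail/mailertable < /etc/mail/mailertable
   service sendmail restart

Similar rules, built on full regular expressions, can be layered in with milters or procmail to filter messages before they ever reach the groupware server.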
Nearly 37 percent of those responding to our e-mail poll run databases on Linux. Only Web, file and mail servers rank higher on the list (at 51, 39 and 37 percent, respectively). Most relational databases, such as Oracle, DB2 and Sybase, have traditionally run on heavy-duty Unix operating systems, such as Sun Solaris and IBM AIX. The cores of these databases are optimized for Unix environments, and Linux fits that bill perfectly. Indeed, Oracle's latest relational database, 10g, was introduced and marketed with a focus on Linux, not Solaris. Microsoft Windows wasn't even mentioned as an option.
When the Ellis Island Foundation had problems with the performance and availability of its site, it turned to Linux. Traffic, already numbering millions of hits per month, increased by an estimated 40,000 hits per hour, thanks to a link letting thousands of genealogy buffs search the foundation's more than 25 million records, including photographs and ship manifests. The solution was an Oracle9i RAC (Real Application Clusters) Red Hat Linux system running on Hewlett-Packard hardware, deployed with no additional staff requirements. The result was vastly improved availability, with search times cut from 15 seconds to 5 seconds.

Architectures requiring high availability and redundancy will show the highest ROI when deployed on Linux. Clustering has been, and still is, the biggest advantage of Linux over its competitors in terms of price-performance. For an example of a clustered deployment, see "Linux on the Inside Track". If you don't want the fun (or hassle, depending on your point of view) of building your own Beowulf, there are plenty of commercial products that will help you roll your own cluster for myriad purposes--Web farms, image and document processing, distributed computing and, more recently, grid computing. Indeed, Linux is likely to dominate clustering within two to three years because of price-performance advantages in the 15- to 20-times range, according to Aberdeen Group.
Linux also excels as a file-and-print server. Among its capabilities for meeting disparate needs are support for AppleTalk, Samba (Microsoft CIFS), Novell NetWare and NFS. In the print-server arena, Hewlett-Packard continues to support an open-source project to further the development of HP printer drivers for Linux. It's rare to find a printer not supported on Linux, and if the printer is an HP model, it's almost always supported natively with all requisite functionality. USB, parallel, JetDirect, IPP (Internet Printing Protocol) and Samba printing are handily supported by CUPS (Common Unix Printing System), which is becoming the de facto standard in Linux for easy implementation of print sharing across networks. Although the free version of CUPS may not be for everyone, the commercial version sports drivers for just about any printer you can think of, and plays well in heterogeneous environments.
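A minimal sketch of both roles, assuming CUPS and Samba are installed, an HP LaserJet on a JetDirect interface at a made-up address, and a PPD that ships with CUPS:

   # define and enable a print queue for the networked LaserJet
   lpadmin -p hplj4100 -E -v socket://192.168.1.50:9100 -m laserjet.ppd
   lpstat -p hplj4100                 # confirm the queue is up and accepting jobs

   # sanity-check the Samba configuration that shares files (and the new printer)
   testparm -s /etc/samba/smb.conf
   smbclient -L localhost -N          # browse the shares as a Windows client would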
We couldn't very well talk about Linux without mentioning its use on the desktop. The biggest barrier to Linux's success there is still availability of applications, both custom and off-the-shelf. Your ability to migrate to a Linux desktop will be directly proportional to the number of custom Windows applications your organization needs: Having to port a long list of applications to the new platform will make a migration financially unfeasible.
In addition, the split between the KDE and Gnome camps has long been problematic. It has hampered acceptance of Linux on the desktop because an application written for Gnome won't always work when run under KDE. True, Linux distribution vendors have begun to address this problem, but for their solution to work, the libraries needed to run applications for both environments must be installed, increasing the footprint of a desktop install.
Support for off-the-shelf desktop applications is growing, but it has a way to go before it catches up to the Windows and even Mac worlds. Although OpenOffice and its predecessor, StarOffice, provide sufficient compatibility with Microsoft Office products, some of the functionality that migrating Windows users expect--including deep integration with mail clients and browsers--is missing. Expect an adjustment period and some (OK, many) calls to tech support.

Other application support is missing entirely. The tools Linux desktop users crave most often are replacements for Visio and for financial software, such as Quicken or Microsoft Money. Although there are Linux applications that claim to provide this functionality, they lag far behind their Windows counterparts and prevent some users from even considering Linux on the desktop.
The most common place you'll find Linux on the desktop is in the call center, where most applications are browser-based and productivity-software needs are minimal. E-mail clients that can connect to Microsoft Exchange and other enterprise-class messaging servers are already available from Ximian (now part of Novell), and the Gecko-based Mozilla/Phoenix and Netscape browsers are fully compliant with W3C standards.
Conversion of the Faithful: Fifty-two percent of Linux developers used to develop primarily for Windows, according to Evans Data Corp.
Emulators such as CrossOver Office and Wine (yes, we know they're really compatibility layers, but they provide the same functionality as emulators) can run some Windows applications under Linux. But such translation layers are often slower than native applications and can cause user productivity to drop.
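Running a Windows binary this way is a one-liner; a hypothetical example, assuming Wine is installed and the installer lives in your home directory:

   wine ~/downloads/app-setup.exe    # launch a Windows installer under Wine

Whether the application then behaves--and how quickly--is what you'll need to test before betting a desktop rollout on it.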
So Where Do I Start?

We knew you'd ask, so we've included a workshop on migrating to Linux, including training and certification (see "Are You Experienced?"). But if you're looking for a quick-start guide, here are our top two tips:
NWC Project: Linux A-List: If you're looking for the perfect open-source application for your data-center-centric Linux server, check out our Linux A-List, compiled and maintained with recommendations by Contributing Editor Don MacVittie.
• Find a project that fits. If there's a fit for the operating system, make a case for Linux. If not, wait for an opportunity--don't deploy just for deployment's sake. When projects are proposed, consider whether Linux is appropriate.
If you're about to migrate from an older version of Windows to a newer one, or upgrade a Unix system, see whether Linux would serve as well. Many organizations have cut operating costs significantly by replacing a single, expensive proprietary piece of hardware with one or more Intel-based systems running Linux.
• Consider your applications. Make certain that applications are available to replace your Windows or Unix programs. In the data center, the application vendor will likely have a Linux version of the software, but always check requirements and availability before you make a move.

Remember, Linux--like open source in general--is all about choice. If it makes sense for the business and the bottom line, don't be afraid to make the move. Linux is data-center ready and waiting for the invitation; it's up to you to determine where and when to invite it in.
Lori MacVittie is a Network Computing senior technology editor working in our Green Bay, Wis., labs. She has been a software developer, a network administrator and a member of the technical architecture team for a global transportation and logistics organization. Write to her at [email protected].
In May 1999 we called Linux the "Swiss Army knife of networking," but said it wasn't quite ready to take over the enterprise. In June 2000 we made our Chicago labs all Linux, all the time, and said adoption was in full swing. In November 2001 we asked, "Are We There Yet?" and tested enterprise-class distributions not only on the server side but also on the workstation side. And last year we looked at a slew of distros and Linux-based appliances. In fact, our 2003 Well-Connected Award winner--SecureLogix Enterprise Telephony Management System--runs on, yes, Linux.
Now, in early 2004, we can say with certainty that Linux is, indeed, there. The backing of industry heavyweights like Hewlett-Packard, IBM, Novell and Sun Microsystems, enterprise-class management and support, and more than 4,500 supported commercial applications have led three-quarters of our 1,029 poll respondents to declare Linux a strong choice for the data center. In "Inside Linux," we look at the remaining roadblocks and offer tips for deciding on an initial deployment, overcoming resistance and knowing when it's wise to wait. In "Where to Find a Fit," we run down the areas where Linux shines and where you should exercise caution.

One of the biggest concerns surrounding open-source software--even before SCO became a household name--is the potential to accidentally include open-source code in proprietary software and vice versa. Furthermore, there are more than 60 separate licenses, with more than 45 of them approved by the OSI (Open Source Initiative), each containing different restrictions and requirements (see www.opensource.org/licenses for a list and explanations).
In the face of so many licenses and the possibility that you might not be complying with them, how do you make decisions about deploying open-source software without hiring an army of legal advisers? Black Duck Software may have a solution: Its eponymous software reports on the ramifications--both legal and economic--of combining proprietary and open-source software (the standard edition launched Jan. 21 at LinuxWorld, and the enterprise edition will be available this spring).

Say you're working on a specific project and need to determine the best combination of, for example, Apache, PHP, MySQL and JDBC drivers. You can use Black Duck's license calculator to determine whether royalties must be paid, whether source must be distributed and which licenses are applicable.
This is great for independent software vendors, but how does it help the enterprise? With its 200-GB database containing signatures for tons of open-source and proprietary applications, Black Duck can determine whether open-source technology has made its way into your software and whether proprietary source code has somehow entered an open-source project.
This information can be invaluable--especially for enterprises that outsource projects. Unless your IT staff is going to comb through an outsourced project line by line, it's unlikely you'll discover licensing or IP violations without the aid of software.
Still, this product is not a cure-all. Black Duck does not indemnify or guarantee the results of its product. Rather, its goal is to arm your IT and legal departments with the information needed to make a decision. So while Black Duck will aid you in your quest for compliance and IP protection, you'll still need legal counsel to CYA.

An increasing number of both consumer- and enterprise-class devices run embedded Linux. Among the growing list: TiVo, Sharp's Zaurus, Sony's HDTV and upcoming PS/X, and Linksys' wireless routers. In fact, embedded Linux is now the No. 2 embedded OS, behind Wind River's VxWorks.
But how long will Linux remain in VxWorks' wake? Now that Wind River has joined the OSDL (Open Source Development Labs) and announced its intention to enter the embedded Linux game, we expect not long.

This doesn't sit well with Microsoft, whose embedded versions of XP and Windows CE compete with Wind River and Linux. Microsoft was running neck and neck with them in 2002 but was expected by most observers to lose ground by 2004. The company's (understandable) lack of enthusiasm for Linux's growth in this market is evident in its recently funded research that indicates a lower cost for embedded Windows products (see a summary of the report). The software giant, as usual, is attacking the development practices and maturity of the competing platform and its network stack.
Microsoft still clearly has an advantage in the tool market. It's difficult to compete with the development environment the company offers, though Wind River's support in advancing Eclipse on the Linux side of the world will give Linux a leg up in this respect. Still, it's difficult to catch up to a company with a long history of delivering integrated, easy-to-use development tools to a variety of markets.
As the embedded OS market booms, especially in the consumer and corporate gadget arena, we expect to see Microsoft and Linux competitors Metrowerks and LynuxWorks fight to the bitter end. Let's hope that developers and consumers are the big winners.