Fortifying Your Network-Access Control

Passwords are still the dominant method of securing data, but with blurring network borders, higher-stakes transactions and stricter regulations, you need more powerful tools to prevent unauthorized access.

January 28, 2005


Identity and access management was a $2.21 billion market in 2003, according to IDC, which expects the market to grow to $3.5 billion by 2008. This spending will be split among various pieces of authentication technology, from user-information databases and authentication servers to middleware and hardware tokens. As the fragments are integrated, authentication will be strengthened by requiring two, three or more factors to assert identity, and by SSO (single sign-on), in which the multiple passwords and logins associated with different networks and applications are replaced by a single authentication at the beginning of the user's workday.

The Evolution of Identity

In the beginning was the user name, and it was good--for a while. Then came the password and with it, single-factor authentication--your identity is ensured by something known (theoretically) only to you. For the majority of organizations we polled, authentication still requires solely a user name and password, and for some applications, that's enough. However, when companies try to make passwords more secure by requiring frequent changes, mandating a mix of numbers and other characters, and banning words found in common dictionaries, they often run up against the limits of human memory. Users may write their passwords on Post-its or forget their passwords and place calls to the helpdesk--calls that cost, according to industry estimates, between $10 and $35 each. Add to that the fact that passwords are prone to theft when written down, used in some remote and wireless network-access applications, or attacked through worms or keystroke-logging spyware, and the need for another level of identity assurance is clear.
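
To see how those per-call figures add up, here is a hedged back-of-the-envelope calculation. The headcount and reset frequency are hypothetical assumptions for illustration; only the $10-to-$35 range comes from the industry estimates above.

# Back-of-the-envelope estimate of annual password-reset cost.
# The user count and reset frequency below are assumed figures;
# only the $10-$35 per-call range comes from the estimates cited above.
users = 5000                  # assumed headcount
resets_per_user_per_year = 2  # assumed reset frequency
cost_per_call = (10, 35)      # industry-estimate range, in dollars

low, high = (users * resets_per_user_per_year * c for c in cost_per_call)
print(f"Estimated annual reset cost: ${low:,} to ${high:,}")
# -> Estimated annual reset cost: $100,000 to $350,000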

That brings us to two-factor authentication, which adds something you possess--usually a hardware authentication token--to something you know. This is the setup more companies are moving toward as they seek to replace the requirements of strong passwords with the security of a single-use PIN token. The most common two-factor authentication tokens are small devices from companies such as ActivCard, Aladdin Knowledge Systems, RSA Security, SafeNet, Secure Computing and Vasco. These devices generate numeric codes that are valid for a limited time or a single use. Some systems require the user to type a challenge string into the token before the passcode is generated, but the level of security for both types is considered similar.
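
For readers curious about what those codes actually are, here is a minimal sketch of both styles, assuming a generic HMAC-based construction. It is illustrative only and is not the proprietary algorithm inside any particular vendor's token.

# Illustrative sketch of the two token styles described above: a passcode
# that changes every 60 seconds, and a challenge-response passcode.
# The HMAC-SHA1 construction here is an assumption for illustration only --
# it is NOT the algorithm used by any specific vendor's token.
import hmac, hashlib, struct, time

def _six_digits(secret: bytes, message: bytes) -> str:
    digest = hmac.new(secret, message, hashlib.sha1).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"

def time_based_code(secret: bytes, period: int = 60) -> str:
    """Code derived from the current 60-second window; valid only briefly."""
    window = int(time.time()) // period
    return _six_digits(secret, struct.pack(">Q", window))

def challenge_response_code(secret: bytes, challenge: str) -> str:
    """Code derived from a server-supplied challenge typed into the token."""
    return _six_digits(secret, challenge.encode())

token_secret = b"per-token seed provisioned by the server"
print(time_based_code(token_secret))
print(challenge_response_code(token_secret, "483920"))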

Neither type represents the future of two-factor authentication, according to Steve Hunt, vice president and director of research for security at Forrester Research. "Every token available is a stopgap or migration step towards smart cards," Hunt says.

We agree: Smart cards have the advantage of being multipurpose and can provide physical-premises access along with network and application authentication. They're also familiar to users, resembling credit cards in form and function. So why haven't they become the norm? Because, unlike simple hardware tokens, they require a card reader--a peripheral not yet standard on most enterprise workstations. Until companies like Dell and IBM include readers in every laptop computer and corporate desktop keyboard, hardware tokens, whether handheld or USB, are going to be a primary two-factor authentication method.

Beyond Two

When two-factor authentication isn't enough, a third factor--something you are--is added using biometrics, or identification by way of biological characteristics, such as voice response or retinal scan. Vendors are evaluating ways to make this technology more economical and widely available through devices like USB fingerprint scanners. Right now, though, biometrics is sufficiently expensive to make it of interest only to those securing very high-value information, as in the government and financial sectors. In addition, the National Institute of Standards and Technology cites wide variations in the accuracy of fingerprint biometric systems. NIST's most recent testing yields some interesting results. For example, multiple-finger recognition is much more accurate than single-finger recognition. Perhaps more important, the quality of the fingerprint images stored in the matching database has a greater effect on results than the quality of the authentication scanner (see full results of NIST's tests at fpvte.nist.gov/index.html). Although smaller fingerprint scanners are coming down in cost, capturing fingerprints, tuning the database and using the biometric scanner add up to an expensive proposition that can be justified only when the systems and data protected have an exceptionally high value.

Finally, though more secure than passwords alone, biometric information is not immune to theft--as we proved (see www.nwc.com/910/910r1side1.html), a stolen fingerprint molded into a rubber doppelganger can fool some biometric scanners, and a fingerprint cannot be reset like a password.

Virtually all operating systems and many applications include facilities for single-factor authentication, and they often speak the same RADIUS protocol used by standalone authentication systems. So why would anyone look beyond what's built in? The reasons include added security and user convenience, but to understand the value of separate authentication services, you must first grasp the benefits and costs of authentication.

It's hard to tally the value of information lost to unauthorized network access--estimates range from millions of dollars to the approximate mineral value of Neptune. Fact is, statistics don't mean nearly as much as the value of any given loss when it's your loss.

Consider these four issues when building a case for strong authentication--which puts justification in the realm of risk management and assessment: required confidence (security) level; transaction value; user impact; and deployment and maintenance costs. The first two items are tightly linked and bear directly on the benefit of the system. At one end of the scale are transactions that have little economic cost and, therefore, require little in the way of identity confidence--logging into the free wireless network at the local library, for example. As the transaction value goes up through the layers of e-commerce to include large institutional financial transactions, the level of identity confidence required rises in lockstep. Of course, benefits must be weighed against costs, the last two items on the list.
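
To make the link between transaction value and identity confidence concrete, here is a hedged sketch of a simple policy table. The thresholds and factor counts are hypothetical; any real mapping comes out of your own risk assessment, not this sketch.

# Hedged illustration of the first two factors -- required confidence and
# transaction value -- expressed as a simple policy table. The dollar
# thresholds and factor counts below are assumptions for illustration.
def required_factors(transaction_value_usd: float) -> int:
    """Return how many authentication factors a transaction of this value warrants."""
    if transaction_value_usd < 100:        # e.g., free public Wi-Fi, low-value lookups
        return 1                           # user name and password may be enough
    if transaction_value_usd < 100_000:    # routine e-commerce and internal systems
        return 2                           # add something you possess (a token)
    return 3                               # institutional transfers: add biometrics

for value in (0, 5_000, 2_000_000):
    print(value, "->", required_factors(value), "factor(s)")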

As for the TCO (total cost of ownership) of authentication, people tend to rush toward the hard-dollar costs, including the license for the authentication server and individual hardware tokens. But from the user perspective, the experience impact, or PITA (pain in the ass) factor, is the lion's share of the cost. Some organizations, for example, up the PITA element by requiring multiple, difficult-to-remember passwords that change frequently. Go this route and users eventually will try to work around the authentication system. Those workarounds may make life easier for users, but they inevitably lead to increased helpdesk calls.

Every authentication system carries costs, even if you're using the authentication capabilities included in your network operating system or enterprise application. Authentication supported within this framework is still the norm, with Computer Associates, IBM and VeriSign the leading vendors. And even with the simplest authentication schemes, you must factor in the costs of developing a user database, assigning privileges, training and supporting users, and ongoing maintenance.

Minimizing the price of authentication will help push two major trends in this segment in the coming years: The move away from individual built-in authentication for the enterprise and an increased reliance on smart cards and other hardware authentication methods, such as tokens, that can be used anywhere. The virtues of built-in authentication technology are that you can buy it once and be done with it, and the purchase price is normally very low. But these benefits are frequently balanced by higher integration costs as staff or consultants strive to knit the authentication databases and mechanisms into a single coherent scheme while keeping the various authentication requirements from running into one another when accessed by high-demand users.


On the flip side, authentication products designed for integration into multiple networks and applications, such as Lucent NavisRadius, Novell Nsure SecureLogin, RSA SecurID and Secure Computing SafeWord PremierAccess, carry the standard software purchase and maintenance costs but compensate with facilities that make it easier to integrate the authentication process into multiple software platforms (for a review of several of these authentication products, see "Not Just a Token Effort").

Reducing integration costs will take on greater importance as organizations get serious about SSO and identity federation (more about them later), and as users force their employers to get serious about minimizing the PITA factor of strong authentication. The push to lower integration costs should lead more organizations to explore third-party authentication systems, which offer richer feature sets and the ability to meld multiple authentication transactions into a single user experience.

The same factors that attract organizations to third-party authentication will make smart cards a strong two-factor authentication option. Users are comfortable with the format, so internal training and political costs will be low. Beyond the human factor, smart cards are already used in commerce and premises-security applications, so integrating them into other transaction settings should be easier than with simple hardware tokens or USB devices. With the addition of RFID capabilities, smart cards can be used for proximity authorization, providing access to devices that don't include keyboards or conventional card readers.

The regular cycles of security purchasing mean an increased interest in authentication is on the horizon, according to Forrester's Steve Hunt, who adds that security deployment moves in waves of authentication, authorization, administration and auditing. He says we're in the audit portion of the cycle, with systems being put into place to demonstrate compliance with laws protecting customer and patient information. As the audit phase passes its peak, the authentication phase begins in earnest; Hunt says Forrester expects the cycle to crest in late 2005 through 2007.

Fortunately, standards bodies have begun acting to bring some regularity to the market, and products adhering to some of the first standards should be in place soon. One example is OATH, the Open Authentication Reference Architecture, a proposed standard for strong authentication being developed by the Initiative for Open Authentication, an industry consortium initiated by VeriSign and joined by vendors including ActivCard, Aladdin, Aventail, BEA Systems, Hewlett-Packard and IBM. Some industry leaders, such as RSA and Secure Computing, have not yet joined the consortium or made public statements about OATH, which is unfortunate, because a true industrywide standard for hardware tokens would be a huge win for customers. The OATH reference architecture relies on two-factor authentication. A number of second factors can be used, ranging from one-time passwords to hardware tokens, smart cards and SIM cards like those used in cell phones. The goal is an open architecture in which components, such as tokens, manufactured by one vendor will be usable to authenticate to a server created by another vendor. IBM and Aladdin have announced products supporting the standard, and the consortium says it will submit the proposed standard to the IETF for formalization. The prospect is intriguing.
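
The interoperability promise is that a token from one vendor and a server from another compute the same one-time password from a shared seed and a synchronized counter. The sketch below shows an HMAC-based, counter-driven scheme of that general kind; the construction and parameters are assumptions for illustration, not the consortium's final specification.

# Illustration of the interoperability goal: a token from vendor A and a
# server from vendor B agree on the same one-time password because both
# implement the same open, HMAC-based counter scheme. This is a sketch of
# the general approach, not the finalized OATH specification.
import hmac, hashlib, struct

def hotp_style_code(shared_secret: bytes, counter: int) -> str:
    mac = hmac.new(shared_secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    return f"{int.from_bytes(mac[-4:], 'big') % 1_000_000:06d}"

# "Token" side and "server" side share only the seed and a synchronized
# counter -- no proprietary algorithm in between.
seed, counter = b"provisioned-seed", 42
code_from_token = hotp_style_code(seed, counter)
server_accepts = hmac.compare_digest(code_from_token, hotp_style_code(seed, counter))
print(code_from_token, server_accepts)   # same code on both sides -> True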

Sign Me On

Although two-factor authentication gets a great deal of attention, strong passwords, safely stored and transported, can provide sufficient security for many environments. What makes a password strong? We've mentioned requiring a mix of numbers and other characters, with no component of the password matching a string found in a common dictionary. The password hash must be stored in a secure, encrypted database, and the password shouldn't be passed "in the clear" during remote access.
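
As an illustration of those rules, here is a minimal sketch of rejecting weak passwords and storing only a salted, iterated hash. The specific length rule, word list and iteration count are assumptions for illustration, not a recommended policy.

# Minimal sketch of the two practices described above: rejecting weak
# passwords and storing only a salted, iterated hash -- never the cleartext.
# The rules, word list and iteration count are illustrative assumptions.
import hashlib, os, re

COMMON_WORDS = {"password", "letmein", "welcome"}   # stand-in for a real dictionary

def is_strong(password: str) -> bool:
    return (
        len(password) >= 10
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
        and not any(word in password.lower() for word in COMMON_WORDS)
    )

def store_password(password: str):
    """Return (salt, hash); only these values belong in the database."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

if is_strong("tr0ub4dor&3x"):
    salt, digest = store_password("tr0ub4dor&3x")
    print(len(salt), len(digest))   # 16, 32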

The problem of end users forgetting their passwords or recording them in an insecure fashion is compounded if the organization has multiple applications, each of which requires its own strong password. For these environments, SSO is a critical step forward in balancing security requirements with user needs. Indeed, as authentication becomes stronger and the possibilities for standards grow brighter, more companies are beginning to consider enterprise SSO--a system in which all networks (wired, wireless and VPN) and all applications are authenticated from user credentials stored during a single login at the beginning of the user session.

SSO's major stumbling block has been technological--how do you pass authentication information between networks and applications, and how do you securely store authentication information from a network login to be used for applications later in the session? SSO is one topic that quickly leads to discussions of dedicated security products. Computer Associates' eTrust, for example, has an SSO module that works within the CA enterprise framework. Other vendors--including IBM, Novell and Sun--use the more expansive phrase "identity management" to build in SSO capabilities. Policy management plays a role as well, because devices authenticate themselves to the network or applications just as users do. In the environments described by Cisco's NAC (Network Admission Control) and Microsoft's NAP (Network Access Protection), for instance, devices that access network services must authenticate and develop a level of trust similar to that attained by users (for a rundown of NAC and NAP, see "What's on the Horizon?" at www.secureenterprisemag.com/0202/0202rd1.jhtml).
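
To make the storage problem concrete, here is a minimal sketch of one common approach: a signed session ticket issued at login, which applications verify instead of asking for the password again. The ticket format and signing key are assumptions for illustration, not any vendor's actual mechanism.

# Conceptual sketch of the second SSO problem above: holding proof of the
# network login so applications later in the session can trust it without
# ever seeing the password again. Format and key are illustrative assumptions.
import hmac, hashlib, json, time, base64

SIGNING_KEY = b"key held by the authentication service"  # assumed shared secret

def issue_ticket(user: str, lifetime_seconds: int = 8 * 3600) -> str:
    claims = {"user": user, "expires": int(time.time()) + lifetime_seconds}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def application_accepts(ticket: str) -> bool:
    body, sig = ticket.rsplit(".", 1)
    expected = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False                      # tampered or foreign ticket
    claims = json.loads(base64.urlsafe_b64decode(body))
    return claims["expires"] > time.time()

ticket = issue_ticket("jdoe")             # granted once, at the start of the workday
print(application_accepts(ticket))        # each application checks the ticket, not a password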

There is, as yet, no significant proposal to merge device and user authentication on the network. But that hasn't stopped architects from looking at the next level of authentication unification--identity federation.

SSO Without Borders

What if your business partner's network accepted the SSO information from your network login? What if your supplier's order-entry screens accepted your identity from your network as you connected over the intranet? These scenarios are examples of identity federation, an arrangement in which one enterprise trusts another to properly authenticate and authorize users. The issues here are philosophical and legal, in addition to technical. The benefits, however, are significant: critical passwords and user information are stored only once and not communicated across possibly insecure links. Instead, networks accept authentication-verification tokens from one another as proof that the user's identity has been established to a satisfactory level.
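
The exchange can be sketched simply, with the caveat that real federation products use public-key signatures and richer assertion formats. The shared-key construction, field names and partner list below are assumptions for illustration.

# Sketch of the federation exchange described above: the partner network
# vouches for the user with a signed assertion, and the relying network
# verifies it against keys it holds for partners it has chosen to trust.
import hmac, hashlib, json, time

TRUSTED_PARTNERS = {                       # keys exchanged when the agreement is signed
    "supplier.example": b"secret shared with the supplier's identity service",
}

def partner_assertion(issuer: str, key: bytes, user: str) -> dict:
    claims = json.dumps({"issuer": issuer, "user": user,
                         "expires": int(time.time()) + 300})
    return {"claims": claims,
            "signature": hmac.new(key, claims.encode(), hashlib.sha256).hexdigest()}

def accept_federated_login(assertion: dict) -> bool:
    claims = json.loads(assertion["claims"])
    key = TRUSTED_PARTNERS.get(claims["issuer"])
    if key is None:
        return False                       # not a partner we federate with
    expected = hmac.new(key, assertion["claims"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["signature"]) and claims["expires"] > time.time()

# The user authenticated on the partner's network; no password crosses the link.
incoming = partner_assertion("supplier.example",
                             TRUSTED_PARTNERS["supplier.example"], "jdoe")
print(accept_federated_login(incoming))   # True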

Microsoft has pushed identity federation at the consumer level, with its .Net framework and Passport services. The difficulty in establishing the relationships has been demonstrated by companies such as eBay, which backed out of the Passport alliance after Microsoft made changes to the technology framework. Competing identity federations, such as the Liberty Alliance (founded by Sun and including financial services providers American Express and Fidelity), have their own standards that don't recognize or interoperate with Passport. Just as users have become comfortable with smart-card forms and technologies, it's possible the consumer-oriented identity federations will provide frameworks that can be used by businesses. A global identity federation, in which a user identifies once and is recognized by all networks and applications, is a lovely concept, but for now, secure authentication that provides an acceptable level of identity assurance for one organization at a time seems a more reasonable goal. A move to strong passwords and on to two-factor authentication, with an end-game of enterprise SSO, is an economical and technologically feasible path for many companies--and a move you should be making or at least planning for.

CURTIS FRANKLIN JR. is a senior technology editor for Secure Enterprise and Network Computing. He was founder of the BYTE Testing Lab, director of labs for Client/Server Labs and managing editor/technology at InternetWeek. He has been writing about the computer and network industries since 1985. Write to him at [email protected].

As identity management matures, the lines between authentication, authorization, directory services and policy management will blur. But for now, passwords are still the dominant method of securing enterprise data. That's not a comforting thought for security pros who realize that higher transaction values, increased customer and shareholder concern, and potentially punitive regulations make it vital to know, with certainty, every user's identity.

In "Strong Authentication," we explore the possibilities for fortifying network-access control. Whether you use beefed-up passwords, USB tokens, smart cards, biometrics or some combination thereof, you can move toward a safer network. To reduce the inherent complexity, we also consider SSO (single sign-on) and the progress on standards.

In "Not Just a Token Effort,", we put five enterprise-class, strong authentication systems through extensive tests. These products enable authentication through token use (proprietary or third-party) and can work with any common enterprise directory, providing a path to SSO. ActivCard's ActivPack AAA Server 6.3, Funk Software's Steel-Belted Radius 4.71, Lucent Technology's NavisRadius 4.0, Novell's Nsure SecureLogin 3.5 and Secure Computing's SafeWord PremierAccess all performed well, but NavisRadius blew us away. Make no mistake--you must know what's going on between the various pieces of your authentication infrastructure to take advantage of all the functionality NavisRadius offers. But if you do, the system's interface and scripting language make for an incredibly flexible experience. In addition, the product costs a pittance given its abilities, earning it not only our Editor's Choice but our Best Value award to boot.

