Standards Matter: The Battle For Interoperability Goes On
We all say we want our gear to work together, but are you willing to hold vendors accountable for breaking faith?
April 16, 2009
Used to be, vendors didn't brazenly fracture standards. Sure, they sought lock-in opportunities, but most knew that if they played too fast and loose, the market would mete out punishment, as in the '90s when TCP/IP rule breakers lost sales.
Times have changed, and not for the better. Take network access control. Cisco has all but abandoned its NAC framework and partner program. Microsoft threw some of its Network Access Protection specifications to the Trusted Computing Group, but Cisco has consistently refused to even acknowledge the TCG's legitimacy. So much for interoperability.
Want more? First proposed in 2004, 802.11n was hung up for years as IEEE task group members hashed through competing technical interests. The widely used 802.1X is being revised because critical features were missed the first time. In the realm of cloud computing, you can't get two people to agree on a definition, much less on what should be standardized, as evidenced by the recent finger-pointing around the IBM-led Open Cloud Manifesto initiative.
And this lack of interest in creating functional, universal standards seems to be accelerating. Cisco's EnergyWise, which proposes building-wide energy management, should have gone to the International Telecommunication Union three years ago. And how confident are you that Fibre Channel over Ethernet will be more interoperable than Fibre Channel?
If we're not careful, standards for nascent technologies could be so splintered as to be worse than none at all.
There's plenty of blame to go around, starting with the big vendors that try to game the process. "The larger vendors know the 'flaws' in the current system," says David O'Berry, director of IT systems and services for the South Carolina Department of Probation, Parole, and Pardon Services. "They know it takes a while for things to progress--especially when you want it to take a while--and so they use that gap to create de facto lock-in at critical junctures."
For their part, vendors counter that standards bodies have devolved to the point that they're almost immobilized by politics and squabbling. Consensus can take years, and the market won't wait that long. "Standards bodies tend to be more focused on the process than achieving the desired result in the shortest time possible," says Mike Healey, CTO of GreenPages Technology Solutions and an InformationWeek Analytics contributor.
Case in point, says Healey, is the 802.11n wireless standard. The a, b, and g iterations had hit a performance wall that was hindering business, but the new spec languished for a year in a draft that differed little from the final release. "Vendors that were willing to 'cheat' and release products based on the draft established a competitive advantage," he says. "Those that followed the rules were left behind. The IEEE delays were less about technical specifications or design issues and more about meeting schedules and documentation timelines."
Healey says he asked Aruba Networks what it would do about its prerelease 802.11n product if the standard changed. "Two words: 'firmware upgrade.'"
Another high-profile standards failure is browser support for HTML and Cascading Style Sheets. Designers who don't know--or care--about the implications of proprietary extensions to HTML spew out Web sites that work only in Internet Explorer for Windows.
IT organizations bear some of the blame as well. We state that standards compliance is a nonnegotiable check box in purchasing decisions, yet we haven't been consistent in insisting on adherence. Storage networking is a prime example. Even though Fibre Channel is an ANSI standard and, in theory, any FC device should communicate with any other FC device, the reality is that vendors competing for storage area network switch and director market share have little incentive to interoperate. Customers take the path of least resistance by purchasing from certified product lists rather than selecting components based on price or feature set.
Is it a coincidence that high-end storage is about the most expensive technology that IT purchases--with the least number of competing products? We think not. So will we let the next big thing in storage, Fibre Channel over Ethernet, follow the same path? Why not insist on an FCoE Alliance with a logoed testing program?
Of course, standards aren't magic pixie dust. IT organizations often can't get even supposedly compliant products to interoperate without hacks and workarounds. But that's no reason to throw up our hands and write off the process. We need to advocate for a smart, independent standards track.
"The secret sauce to a successful 'working standard' isn't necessarily IETF or another longstanding body," says Jonathan Feldman, director of IT services for the city of Asheville, N.C., and an InformationWeek Analytics contributor. "Rather, an earnest and honest effort by a group that has governance outside of a single corporation's control is what's important."
Along with vendor independence, interoperability testing is critical. We have interoperable 802.11 wireless products not just because there's a standard, but because vendors backed the Wi-Fi Alliance and decided to cooperate. As the Wi-Fi Alliance certification took hold, it started showing up as a requirement in requests for proposals, which motivated more vendors to participate and get certified. The result: Certified WLAN products interoperate--unlike SANs.
Hot-Button Issues
Cloud computing and green IT are two of the hottest IT areas and two hotbeds of standards angst. The cry for cloud computing standards in particular has reached a fever pitch in recent weeks since a group of vendors released the Open Cloud Manifesto, an outline of core principles intended to boost interoperability among various cloud computing technologies.
Critics charged that IBM and other vendors developed the manifesto behind closed doors and then tried to foist it on the industry. Microsoft, which said it agreed in principle with most of the manifesto, was nonetheless disturbed by the lack of openness in the process (the irony isn't lost on us) and called for more open discussion. Amazon.com's response was more measured, noting that the company has offered Amazon Web Services APIs in various languages and formats based on customer demand.
People should calm down. Frankly, the manifesto was simply a statement of intent with a diverse set of backers, including AT&T, F5 Networks, Hyperic, IBM, and SAP. What isn't clear to us is whether cloud computing standards are necessary.
That's not to say we don't need specs for the technologies that keep clouds aloft. Some standards work for virtualization, for instance, already has been completed. In March, the Distributed Management Task Force announced the Open Virtualization Format (OVF), which standardizes the virtual machine file format and schema description and is backed by VMware and Citrix Systems. Microsoft says it will support the spec in Hyper-V but hasn't provided a timeline.
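To give a sense of what OVF buys you, here's a minimal Python sketch that reads a stripped-down descriptor. The namespace URI matches the OVF 1.0 spec, but the sample XML is illustrative only--it omits required sections, and the system name is made up.

```python
# Sketch: OVF standardizes VM packaging metadata as plain XML that any
# vendor's tool can parse. Sample descriptor is illustrative, not a
# complete, validated OVF document.
import xml.etree.ElementTree as ET

OVF = "{http://schemas.dmtf.org/ovf/envelope/1}"  # OVF 1.0 namespace

sample_descriptor = """<?xml version="1.0"?>
<Envelope xmlns="http://schemas.dmtf.org/ovf/envelope/1">
  <VirtualSystem id="web-tier">
    <Info>A single virtual machine</Info>
    <Name>web01</Name>
  </VirtualSystem>
</Envelope>"""

root = ET.fromstring(sample_descriptor)
for vs in root.iter(f"{OVF}VirtualSystem"):
    # Any OVF-aware tool, regardless of vendor, can extract the same metadata
    print("virtual system:", vs.findtext(f"{OVF}Name"))
```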
In the green IT market, Cisco came under fire for not taking EnergyWise--its program for monitoring, managing, and reducing power consumption that it says was three years in the making--to a standards group. The company counters that when it started on EnergyWise, the goal was simply to manage the power consumption of Power over Ethernet devices, but over time the scope broadened to include more devices, data gathering, and management.
What Cisco ended up with wasn't what it started with, says Hugh Barrass, a Cisco technologist responsible for standards, who adds that the vendor has every intention of submitting the EnergyWise work to a standards body. "Digging in our heels isn't beneficial to anyone," Barrass says. "What we learn from EnergyWise implementations we will take to the standards bodies to create effective standards."
Competitors counter that Cisco's aim was to get a jump on everyone else--there are few green IT standards on the table, outside of the IEEE 802.3az Energy Efficient Ethernet task group, which doesn't seem to be gaining much traction. And in fact, Cisco does benefit from being first out, if only for the bragging rights should EnergyWise prove successful.
HOT OR NOT?
HOT
Data Center Bridging: Lossless, high-speed Ethernet with flow control? Good for storage. Good for data. Good for you.
802.1X-REV: 802.1X becomes a more usable protocol. Finally.
ODF/OOXML: Standardized file formats for seamless import/export? Love it, and the death match to dominate is good TV.
NOT
Cloud Standards: We know there's been a lot of chatter, but first define cloud, then we'll talk.
802.11v: Wireless network management via Layer 2 seems like a good idea, but is it necessary?
DNSSEC: Yeah, we need it, but infrastructure and operations aren't even close to ready.
Common Event Expression: A common event format and taxonomy should be hot, but no way is this getting into products.
NAT66: Transport-agnostic IPv6-to-IPv6 translation. Apparently, the new scheme won't solve old problems.
Gray Areas
Standards bodies aren't a panacea. Some specs designed from the ground up in these groups are still incomplete. 802.1X, ratified in 2001, defines host authentication to authorize use of a switch port. Unfortunately, the spec defined port use as all or nothing, either open or closed, which meant hosts unable to authenticate, such as guests or devices that don't support 802.1X, couldn't connect at all, at least according to the standard.
To support guest access, Hewlett-Packard and other vendors added proprietary capabilities to their switches. Cisco went a step further, allowing its Cisco Discovery Protocol to pass through the port to discover a host, such as a voice-over-IP phone, before the port is authenticated. Both functions are necessary, and even though they don't comply with 802.1X, they don't break interoperability, either. The IEEE is now working on a revision to 802.1X that enhances the protocol based on needs discovered in field deployments.
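To make the all-or-nothing gap concrete, here's a toy Python sketch of the port-authorization decision. It's purely illustrative--the guest VLAN number and host attributes are invented, and this is not any vendor's actual switch logic.

```python
# Toy model: strict 802.1X-2001 port authorization vs. the proprietary
# guest-VLAN fallback vendors added. Illustrative only.
from dataclasses import dataclass

GUEST_VLAN = 100  # hypothetical VLAN ID for unauthenticated hosts

@dataclass
class Host:
    mac: str
    supports_dot1x: bool
    credentials_valid: bool = False

def authorize_port(host: Host, allow_guest_fallback: bool) -> str:
    """Decide the port state after an authentication attempt."""
    if host.supports_dot1x and host.credentials_valid:
        return "authorized: full network access"
    if allow_guest_fallback:
        # Vendor extension (not in 802.1X-2001): park the host on a guest VLAN
        return f"unauthorized: restricted to guest VLAN {GUEST_VLAN}"
    # Strict 802.1X-2001 behavior: the port simply stays closed
    return "unauthorized: port closed, no connectivity"

printer = Host(mac="00:11:22:33:44:55", supports_dot1x=False)
print(authorize_port(printer, allow_guest_fallback=False))  # per the spec
print(authorize_port(printer, allow_guest_fallback=True))   # vendor workaround
```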
That 802.1X example lends credence to vendors' favorite argument: Technology must be developed and deployed with live customers before functionality can be standardized.
"As standards grow larger in scope and become more complex, it becomes difficult to make fundamental changes to the underlying architecture," says Paul Congdon, CTO of HP ProCurve and a longtime representative to several standards groups. "This is similar to the software problem of adding new architectural constructs to mature operating systems." The answer, many industry players argue, is targeted industry consortia that can perform in-depth testing while maintaining compatibility.
Standards Equality
Veteran standards bodies such as the IEEE, International Telecommunication Union, and Internet Engineering Task Force have established procedures for taking a standard from a twinkle in an engineer's eye to publication. However, with the fast pace of technology innovation, work in these bodies can drag on to the point where the standards they're developing aren't delivered until long after demand has peaked.
"The time to build standards is when knowledge is high, because you know what functions need to be worked on and politics are low, because no one has an entrenched stand to defend," says Steve Hanna, a distinguished engineer with Juniper Networks and another longtime member of various standards groups. That can be a narrow window that vendors often step through with proprietary functionality.
One answer: small, targeted industry consortia, like the Trusted Computing Group and the Metro Ethernet Forum, that can move faster than the large standards bodies. Hanna, who's co-chair of the Trusted Network Connect and IETF Network Endpoint Assessment working groups, says the TNC completed initial work on its NAC spec in about a year. Industry consortia are more nimble than the big bodies because their working groups are smaller and more focused. They also provide interoperability testing and certification programs and coordinate work with other standards entities.
The Metro Ethernet Forum provides a good example of how to work with veteran standards bodies, vendors, and customers (in this case, carriers). The MEF decided early not to be a standards body, but to act as a liaison between its members and standards bodies and to communicate customer needs to the standards groups, says Craig Easley, VP of marketing at Matisse Networks and co-chair of the MEF North American Marketing Committee. The MEF recruited vendors and customers as members, carriers started making MEF certification a requirement in their requests for proposals, and now the carriers themselves are seeing MEF certification requirements show up in RFPs from prospective enterprise customers. The result: a set of products that interoperate, reducing costs for all involved.
You Say Tomato ...
Standards bodies try to limit overlap and redundancy. Competing specs for the same functions benefit no one; proposals should be fought over in a single work group and a winner chosen based on technical merit. The IETF, IEEE, and others follow those guidelines and reuse each other's specs where it makes sense. They aren't in competition.
Unfortunately, politics can get in the way when a vendor has such a large stake in a format that it simply won't budge. Case in point: The International Organization for Standardization (ISO) published the spec for the Open Document Format, or ODF, a file format Sun Microsystems developed for StarOffice/OpenOffice.org. ODF is missing some critical components, such as a full specification for spreadsheet formulas, but OASIS, the sponsoring organization, is working on those. Microsoft, with its huge investment in its own software, countered by submitting its own Office Open XML (OOXML) proposal to the Ecma International standards group, which in turn submitted it to the ISO, which published the standard in 2008 amid much controversy.
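One concrete irony: both formats are ZIP packages, and telling them apart takes a dozen lines of code. A rough Python sniffer, assuming well-formed files (the file names in the usage comments are hypothetical):

```python
# Sketch: distinguish the two competing ISO document standards by the
# ZIP entries each one mandates. Rough heuristic, not a validator.
import zipfile

def sniff_document(path: str) -> str:
    """Identify an office document by its required ZIP entries."""
    with zipfile.ZipFile(path) as z:
        names = set(z.namelist())
        if "mimetype" in names:
            # ODF packages self-declare, e.g.
            # application/vnd.oasis.opendocument.text
            return "ODF: " + z.read("mimetype").decode("ascii")
        if "[Content_Types].xml" in names:
            return "OOXML package (.docx, .xlsx, .pptx)"
        return "unknown package"

# Usage (hypothetical files):
#   sniff_document("report.odt")  -> "ODF: application/vnd.oasis.opendocument.text"
#   sniff_document("report.docx") -> "OOXML package (.docx, .xlsx, .pptx)"
```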
"Multiple standards coexist in many industries," says Jean Paoli, general manager of interoperability strategy for Microsoft. "Customers benefit from choice and functionality in their document formats, such as HTML, PDF, Open XML, and ODF. Standardization of Open XML by ISO and IEC ensures access and opportunity to all." What Paoli fails to point out, however, is that HTML, PDF, and OOXML have very different use cases. OOXML and ODF directly compete, and having two competing, noninteroperable formats is no benefit.
The ISO sidestepped the issue, stating that competing standards aren't unprecedented and that the market should decide. By that logic, since most of the world uses Microsoft Office applications, Redmond's formats are the de facto standard.
But critics take issue. "De facto standards are contradictory because they are held by one company and implemented only by those allowed to implement them, and the permission to do so can be changed," says Louis Suarez-Potts, community manager for Sun's OpenOffice.org.
Microsoft won't fully implement ISO Office Open XML until the next version of Office, Office 14. The European Union is pressuring Microsoft to support ODF. OpenOffice.org already supports OOXML, and Microsoft wrote a module for Office 2007 to read and write ODF.
Yeah, it's a mess.
The takeaway: If you're guilty of relegating standards support to a "nice to have" feature rather than a requirement, you're part of the problem. If you want products to interoperate, be prepared to walk away when a vendor can't prove compliance. Don't be brushed off with promises of standards support "on the road map." The alternative is vendor lock-in and higher costs, including the cost of maintaining systems that don't work together. Standards bodies are imperfect and must do better, because the endgame otherwise is splintered networks and broken promises.