Learn From the Past, Master the Future!

When dealing with modern IT, the best lessons are probably behind you

August 12, 2004


Computing history offers great insight for today's data center managers, especially those of you who are constantly running out of floor space for equipment, frazzled by data backup concerns, weary of heightened security measures, and reminded of it all by the ever-annoying end-user.

Care to reminisce?

In the old days, computing centers held huge mainframe computers that filled rooms, floors, and buildings. These beasts served hundreds or thousands of users. Getting access to the computer involved deploying batch entry stations (a.k.a. keypunch machines), teletypes, or dumb terminals all over the place. Sometimes terminals were spread throughout an enterprise, requiring long cable runs or even leased bandwidth connections for remote terminals.

The client-server paradigm is just a reinvention of the old paradigm [ed. note: or what our primitive forebears naively called a "model"] of mainframe-terminal computing. An enterprise used to have one large mainframe with massive processing power and lots of I/O channels. Everyone connected to the mainframe through a terminal. We're returning to that paradigm, with thin clients that access data on an enterprise-class server or on disk arrays in the data center.

Today, lack of real estate is still a problem. What used to take rooms of computing power can easily fit into an equipment rack or two. Disk drives used to be the size of washing machines; now they fit in your coat pocket. But data center managers must still cope with what always seems like not enough space for all the hardware they must manage.

Ensuring data integrity is another facet of computing that hasn't aged well. Presumably, most managers recover from backup tapes (you do keep backups, don't you?) or some other media, perhaps CD-R or DVD-R. But what's the longevity of optical media? Is it as reliable as nine-track tapes, which, quaint though they may be, are generally still readable even 30 years after they were first written?

Security, too, has never ceased to be an issue. Administrators of the past had to contend with malcontents trying to gain access to their mainframes, for whatever purposes, over dialup modem lines. These days, the battlefronts are network links to the Internet. Firewalls and password authentication servers are nothing new. They are simply updated versions of technology that administrators of the past relied on to keep their computing center secure.

Another thing that hasn't changed is the end users. They're still a pain in the ass, complaining about the network being too slow or not having enough hard drive space, or causing the data center manager grief with their sometimes adorable, but mainly annoying, cluelessness.

As you progress in your career, keep this article in mind. Technologies that appear on the scene touted as "new" are often steeped in decades-old computing concepts. Be mindful of ways in which old or outmoded technology concepts can solve a problem you're confronted with. You might just be the one to apply an old technology in a new and perhaps novel way, and then pass it off as "new (and improved)."

For example: look for computing power to continue shifting toward centralized servers, then back to the user's domain, until computing power is eventually spread out over every device on the network, at which point the network becomes the computer. This is the paradigm being touted by some futurists, and I think it will eventually come to fruition.

Other patterns like this can be discerned by learning about antiquated computer technology. So study your computer history as a matter of normal practice. You'll not only learn a lot and gain a better understanding of modern technology, but you'll also be surprised at how many innovations came well before you thought they did.

A good place to start your studies is on the Web and in the IEEE Annals of the History of Computing, published quarterly.

— Sellam Ismail is a historical computing consultant who runs VintageTech and produces the Vintage Computer Festival
