When the Lights Went Down in the City

It's a year since the juice ran out in NYC and IT managers recall how they (and their systems) survived

August 14, 2004


Tomorrow marks the first anniversary of the blackout that wreaked havoc in New York and other cities across the Eastern Seaboard. One year on, NDCF checked in with IT managers in NYC to find out how they coped when the juice ran out.

Just how big a deal was it? Well, perhaps prompted by the events of 9/11, it appears that most firms already had solid disaster recovery plans in place. But, wary of future threats, they are still hard at work improving their defenses.

Cross Border Exchange, a New York-based ASP that provides technology for securities trading, has overhauled its disaster recovery and evacuation planning procedures since the blackout. All of the company's development file servers, for example, are now connected to an uninterruptible power supply (UPS).

Even so, the firm emerged unscathed from the blackout, thanks to the fact that its data center was located on a separate power grid. Staff in the company's locations outside the city simply took over the running of the business while the New York office was out of commission.

However, the company's COO John Entwistle did not exactly emerge unscathed. As head of operations, the unfortunate exec had to make sure the New York site was secure, climbing 27 flights of stairs in the dark, flashlight in hand, to check that a door was locked. “The exercise was good for me,” he says, still catching his breath.

Elsewhere in the city, over at Mi8, a Microsoft Corp. (Nasdaq: MSFT) Exchange hosting company, the elevators (not to mention the air conditioning) were also out of commission. But like Cross Border Exchange, Mi8 managed to keep its main systems up and running.

“We didn’t experience any interruption of our services last year because all of our services are based in AT&T Corp. (NYSE: T)'s data center in Manhattan,” says Patrick Fetterman, Mi8’s VP of marketing and products. “There was zero downtime on the servers.”

According to Fetterman, the AT&T data center has more than 24 hours' worth of battery power onsite and up to three weeks' worth of diesel fuel for its generators.

Fetterman says the blackout actually helped Mi8 win some new business, as some companies realized the shortcomings in their own backup and recovery operations.

Like thousands of others, Fetterman walked home across the Brooklyn Bridge on the day of the blackout, and he has mixed feelings about the events of August 14, 2003. “I don’t look back on it fondly, but it did prove the value of the outsourced model,” he says.

For staff at software development company Data Synapse, it was a similar story. Although non-core systems such as desktops went down in the company’s fifth-floor offices on Broadway, failover mechanisms moved key customer support systems over to different power grids.

After the events of 9/11, Data Synapse’s top priority was making sure that staff were safe, according to the company’s COO Frank C. Cicio, Jr. “After the experience of 9/11 we wanted to make sure that no one was in trouble, and make sure that nobody was stuck in an elevator or anything like that,” he says. “I am glad to say that no one was injured,” he adds.

— James Rogers, Site Editor, Next-gen Data Center Forum
