Image Engine Consolidates Data Center, Ramps Up Storage For Performance Improvement

Known for its graphics rendering work for the Stargate TV series, The Hulk, Blades of Glory and other video productions, Image Engine pushed storage and processing to the point where a re-assessment of IT architecture was needed--as well as a concerted strategy for reducing equipment footprints in the data center.

October 1, 2009


Video rendering is the generation of an image from a three-dimensional object model, a highly detailed data structure capturing information such as texture, lighting and shading. Video rendering is central today to the production of films, television shows and other video products, and it is a specialized discipline requiring skilled artists and graphics engineers as well as the "right stuff" for computer-based processing and image storage.

Known for its graphics rendering work for the Stargate TV series, The Hulk, Blades of Glory and other video productions, Image Engine wanted to further improve its production operations to handle both the volume of rendering and the number of people doing that rendering on a heavy production schedule. The success of Image Engine's rendering business, coupled with growing demand, had pushed storage and processing to the point where a re-assessment of IT architecture was needed--as well as a concerted strategy for reducing equipment footprints in the data center.

"We wanted to make the video rendering environment for our employees and our contractors more robust, in addition to creating an IT infrastructure that could be accommodated in a data center facility that was not that large," said Terry Bates, Image Engine's head of systems. "We operate in a Linux environment, with our production work being done on 20 Mac workstations and our administrative and accounting work being done on 90 computers in a Windows environment. We also have over 200 HP quadcore blade servers that support our operation, and that run Linux. We are progressively transitioning these blade servers to a 64-bit operating system."

To improve IOPS (input/output operations per second) performance in production, Image Engine made the decision to move to a BlueArc Titan platform for its storage needs. "We had worked with BlueArc before, so it was easy to go that way," said Bates. "The BlueArc head can handle more storage, it fits neatly into the rack, and it also takes up less space in the data center. We had considered other storage solutions, but BlueArc gave us over 100,000 IOPS at peak, with scalability that would allow us to further increase IOPS as our operation continued to grow."
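For readers unfamiliar with the metric, IOPS simply counts how many individual read or write operations storage can service per second. The rough Python sketch below shows one way to probe random-read IOPS from a client against a mounted share; it is for illustration only, serious benchmarking would use a dedicated tool such as fio, and the test path, block size and duration are hypothetical.

    #!/usr/bin/env python3
    """Illustrative sketch only: a crude random-read IOPS probe against a
    file on a mounted share. The path, block size and duration below are
    hypothetical; a real benchmark would use a tool such as fio."""

    import os
    import random
    import time

    TEST_FILE = "/mnt/fc_production/iops_test.bin"   # hypothetical path
    BLOCK_SIZE = 4096                                # 4 KB reads
    DURATION = 10                                    # seconds to run

    def measure_iops() -> float:
        size = os.path.getsize(TEST_FILE)
        ops = 0
        deadline = time.time() + DURATION
        # Caching is not bypassed here, so this measures the client-side
        # view of the share rather than raw disk performance.
        with open(TEST_FILE, "rb", buffering=0) as f:
            while time.time() < deadline:
                f.seek(random.randrange(0, size - BLOCK_SIZE))
                f.read(BLOCK_SIZE)
                ops += 1
        return ops / DURATION

    if __name__ == "__main__":
        print(f"Approximate random-read IOPS: {measure_iops():.0f}")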

Image Engine's production environment features two BlueArc Titan 3200 heads, with each head handling different functions. "We use one head exclusively for rendering, and one for our other users," said Bates. "Each head handles 180 terabytes of storage, with the capability of peaking at a 220 terabyte load. With the BlueArc equipment, we use a combination of fibre channel and SATA (serial ATA) storage. The fibre channel storage holds mostly the data that we are currently working on in production, and the SATA houses data that is less often used or accessed. In both cases, we use hard drive (HDD) media, with most of the individual drives running at 10K RPM."

Having strong failover capability was paramount to the way Image Engine architected its storage. "We achieve high failover and continuous production by using constant data replication from fibre channel to SATA," said Bates. "This data replication runs all of the time and updates every hour."
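What hourly tier-to-tier replication looks like in principle can be sketched in a few lines of Python. This is not Image Engine's or BlueArc's implementation--the Titan heads replicate natively--and the mount points, interval and rsync-based copy below are assumptions for illustration only.

    #!/usr/bin/env python3
    """Illustrative sketch only: an hourly replication loop from a fast
    (fibre channel) tier to a slower (SATA) tier. BlueArc Titan heads do
    this natively; this script just shows the general idea. All paths
    below are hypothetical."""

    import subprocess
    import time

    FC_TIER = "/mnt/fc_production"      # hypothetical fibre channel mount
    SATA_TIER = "/mnt/sata_archive"     # hypothetical SATA mount
    INTERVAL_SECONDS = 3600             # replicate once per hour

    def replicate() -> None:
        # rsync copies only changed files, which keeps each hourly
        # update cheap relative to a full copy of the tier.
        subprocess.run(
            ["rsync", "-a", "--delete", FC_TIER + "/", SATA_TIER + "/"],
            check=True,
        )

    if __name__ == "__main__":
        while True:
            replicate()
            time.sleep(INTERVAL_SECONDS)

A vendor-native replication engine tracks changes inside the file system rather than rescanning the whole tree as rsync does, but the hourly cadence Bates describes is the same idea.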

A second critical component was automated storage management. "We used to have to go into our systems and create all of the file systems manually," said Bates. "Now, we have scripts in place that automatically perform the directory structuring and that then populate the files within the structures. The system automatically produces reports as it monitors system performance, and it sends alerts whenever there are issues that require our direct attention."
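Bates does not detail the scripts themselves, but a rough Python sketch of the same pattern--build a standard directory tree automatically, then raise an alert when something needs attention--might look like the following. The directory layout, capacity threshold and e-mail addresses are hypothetical.

    #!/usr/bin/env python3
    """Illustrative sketch only: automated creation of a per-show
    directory structure with a simple capacity alert, in the spirit of
    the provisioning scripts described above. The layout, threshold and
    addresses are hypothetical."""

    import os
    import shutil
    import smtplib
    from email.message import EmailMessage

    SUBDIRS = ["plates", "renders", "comps", "cache"]   # hypothetical layout
    ALERT_TO = "ops@example.com"                        # hypothetical address

    def provision(show_root: str) -> None:
        # Build the standard directory tree for a new show.
        for sub in SUBDIRS:
            os.makedirs(os.path.join(show_root, sub), exist_ok=True)

    def check_capacity(path: str, threshold: float = 0.9) -> None:
        # Send an alert when the filesystem holding `path` is nearly full.
        usage = shutil.disk_usage(path)
        if usage.used / usage.total > threshold:
            msg = EmailMessage()
            msg["Subject"] = f"Storage alert: {path} over {threshold:.0%} full"
            msg["From"] = "storage-monitor@example.com"
            msg["To"] = ALERT_TO
            msg.set_content(f"{usage.used} of {usage.total} bytes in use.")
            with smtplib.SMTP("localhost") as smtp:
                smtp.send_message(msg)

    if __name__ == "__main__":
        provision("/mnt/fc_production/new_show")
        check_capacity("/mnt/fc_production")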

Bates said that attractive pricing was a consideration in adopting BlueArc--but so was a project timeline that seemed to make the transition very achievable. "We started the project in September of 2008 and completed it in June of 2009," said Bates. "This project required one to three of our people, depending on which phase of the project we were working on. Throughout the project, BlueArc always had two or three people engaged or available."

Image Engine's project timeline included planning, setup, cutover and support. Implementation went as planned, with a few minor surprises that were quickly resolved by the project team. "Initially, we encountered some trouble with snapshots as we moved data over," said Bates. "We also had a couple of problems to solve once we were in production, but we got around these issues without too much disruption, and stayed within our project timeframe."

With its storage project now behind it, Image Engine is well positioned to meet its video rendering performance requirements and to scale upwards as more storage is needed. "We've got ease of use, performance and no need for additional heads over the next year," said Bates. "Most importantly, we have the scalability we need to easily augment storage in the future without having to initiate major upgrade projects."
