One of the great challenges for Technicolor is that, on any movie, the work is often completed in different geographic areas. Sound work might be completed in London, for example, while the final rendering is done in Los Angeles. As a result, the distribution network must run smoothly, with no dropped frames. This is compounded by the fact that there are often many versions of the same film—producers and directors often want to look at different versions of scenes and choose the one they like.
As you can imagine, the content for upcoming films also has to be secure. Technicolor uses a secure transmission system from Aspera to make sure no one can steal, say, the third installment of the Dark Knight franchise. Instead of being audited by a banking regulator, Technicolor opens its IT infrastructure to inspection by studios, who make sure individual workers can’t offload a film to a personal drive or transmit files over the network without strict clearance.
One lesson for any IT shop, says Davis, is to let broadband carriers bid on services. This has the distinct advantage of lowering costs and providing flexibility for projects. Davis also takes the need for low latency very seriously. A single movie can consume up to 8 TB per file, and a 3D movie doubles that. Where you locate your data centers has a great impact on latency. (Fortunately, in Los Angeles, there is plenty of dark fiber that Technicolor can tap at any time for a boost in speed.)
Technicolor also uses fabric virtualization technology from Xsigo. This lets the company connect its data centers as though there were one main connection rather than 40 different ones. Provisioning becomes simpler, and the result is fewer hardware switches and ports. The network is thus defined primarily by software, linking all of the data centers for smooth transmissions.
Livestream: Boosting Scalability with 750 TB NAS Filers
Livestream, as its name implies, is known for live streaming events. The company can process many terabytes of data for a single event, such as Whitney Houston’s funeral. Livestream also handles user authentication and other back-end responsibilities. In many cases, the level of interest surprises the company—for example, 200,000 users signed on almost immediately, with no prior warning, to watch Houston’s funeral.
Nicholas Tang, vice president of development operations at Livestream, says the data processing needs can be jaw-dropping. The site might have 200,000 concurrent users viewing the same stream, and the data center has to serve each user at rates up to 2.5 Mbps to keep the video smooth. For an event that size, Livestream may push more than 150 Gbps of traffic.
IT executives in many industries can manage around occasional transmission errors and work around downtime, but in the video world, everyone notices a dropped frame. To address these needs, Livestream uses the EMC Isilon scale-out NAS storage platform, which uses an intelligent management system to provide speed and storage for particular IT projects on the fly. Martin Libich, senior technical consultant at EMC Isilon, says Livestream has unique needs. The site needs to scale up and down very quickly: just 10 seconds of HD video is the equivalent of accessing and storing 500,000 pages of documentation, he says.
Traditional filers that handle data serving can manage only about 100 TB of data before administrators must split it across multiple volumes, which can slow performance. Livestream uses Isilon to sidestep this event data processing problem; the platform is enough to handle even the most intense live event, from a Justin Bieber concert to a presentation.
Tang says one of the main challenges in his industry is simply making sure that an event can scale. This requires planning and understanding both the bandwidth requirements and the vendors involved. Rather than be caught off guard when a content delivery network can't handle a sudden surge, Livestream estimates likely traffic in advance and then monitors for upticks.
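The back-of-the-envelope math behind that kind of planning is straightforward: multiply expected concurrent viewers by per-viewer bitrate and pad the result. The sketch below is a hypothetical illustration, not Livestream's actual planning tool; the safety factor is an assumed figure, and in practice adaptive bitrates and CDN offload mean measured peaks can come in well below this ceiling.

```python
# Hypothetical capacity-planning sketch: estimate aggregate streaming
# bandwidth for a live event and the headroom to request from a CDN.
# The safety factor is an illustrative assumption, not a vendor figure.

def required_bandwidth_gbps(concurrent_viewers: int,
                            bitrate_mbps: float,
                            safety_factor: float = 1.5) -> float:
    """Aggregate egress in Gbps, padded by a safety factor for spikes."""
    raw_gbps = concurrent_viewers * bitrate_mbps / 1000.0
    return raw_gbps * safety_factor

# 200,000 viewers at a 2.5 Mbps peak bitrate:
peak = required_bandwidth_gbps(200_000, 2.5)
print(f"Provision for ~{peak:.0f} Gbps")  # 500 Gbps raw, 750 with headroom
```

Running the estimate ahead of an event, then comparing it against live telemetry, is what turns a surprise surge into a planned-for peak.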
Nice Shoes: Moving Data Centers to Amazon’s Cloud
Movie studios aren’t necessarily the most technically advanced when it comes to IT infrastructure. Some are designed to deal only with directors, producers and actors, with the hardcore post-production work farmed out to special effects shops like Industrial Light and Magic.
For Nice Shoes, based in New York City, one of the greatest challenges it faces is making the data center as secure as possible. This is incredibly important when you are working on a music video for Lady Gaga or putting the finishing touches on a major feature film. At the same time, as with any creative effort, the company tends to collaborate with clients over the Web.
Nice Shoes uses the DDN S2A9900, a scalable storage appliance that can re-create massive data sets as needed using erasure-coding algorithms when the data is not available from primary disks. The company also uses the Amazon EC2 cloud computing service.
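The core idea behind that kind of reconstruction can be shown with the simplest possible erasure code: a single XOR parity block. This is only a toy illustration of the principle; production appliances like the S2A9900 use far stronger codes (Reed-Solomon variants) that survive multiple simultaneous failures.

```python
# Minimal illustration of the erasure-coding idea: a parity block lets
# you rebuild any single lost data block from the survivors. Real
# storage appliances use Reed-Solomon-style codes, not bare XOR parity.

def xor_blocks(*blocks: bytes) -> bytes:
    """XOR equal-length byte blocks together."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)

data = [b"AAAA", b"BBBB", b"CCCC"]   # a stripe of data blocks
parity = xor_blocks(*data)           # stored alongside the data

# The disk holding block 1 fails; rebuild it from survivors plus parity:
rebuilt = xor_blocks(data[0], data[2], parity)
assert rebuilt == data[1]
```

Because the parity is just the XOR of the stripe, XOR-ing the surviving blocks back in cancels them out and leaves the missing block.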
Robert Keske, the CIO of Nice Shoes, says using scaled storage is a must for creative work.
“Our design and CGI departments develop a tremendous amount of modeling and animation that requires a fair amount of rendering (computing),” Keske says. “In the past, that meant building out massive render-farms. In most cases, the sheer capital expenditures and timeframes required to build such a render-farm did not make much business sense, which resulted in us turning down work just because we did not have the infrastructure in place to be able to perform the work efficiently.”
The elasticity of Nice Shoes’ storage has changed how the company works. Remote offices can connect over extremely high-speed networks as though the designer is sitting in the next cubicle. When employees need more storage, the cloud provides automated allocations on demand.
“We have built a [virtual private cloud] tunnel from our NYC offices to Amazon and developed EC2 virtual machine nodes that are configured with rendering software as packages, along with process automation of scripts to turn on the environment as needed without engineering involvement,” Keske says. “Our designers, animators and CG artists submit their render needs and the Amazon VPC cloud automatically starts up the required EC2 nodes and servers, computes, then delivers the results directly to our NYC servers.”
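The pattern Keske describes (size a fleet for the job, launch it, tear it down) can be sketched in a few lines. This is a hypothetical illustration, not Nice Shoes' actual tooling: the per-node throughput figure, AMI ID, and instance type are placeholders, and the launch call requires AWS credentials and the boto3 library.

```python
# Hypothetical sketch of an on-demand render fleet: size the fleet for
# a job's deadline, then start EC2 nodes. Throughput, AMI ID, and
# instance type below are illustrative placeholders.
import math

FRAMES_PER_NODE_PER_HOUR = 120  # assumed throughput of one render node

def nodes_needed(total_frames: int, deadline_hours: float) -> int:
    """How many render nodes to launch to finish by the deadline."""
    per_node = FRAMES_PER_NODE_PER_HOUR * deadline_hours
    return math.ceil(total_frames / per_node)

def launch_render_fleet(count: int, ami_id: str = "ami-0123456789abcdef0"):
    """Start `count` render nodes; needs AWS credentials to actually run."""
    import boto3  # imported lazily so the sizing logic works without AWS
    ec2 = boto3.client("ec2")
    return ec2.run_instances(
        ImageId=ami_id,
        InstanceType="c5.4xlarge",  # placeholder compute-optimized type
        MinCount=count,
        MaxCount=count,
    )

# A 14,400-frame job due in 12 hours needs 10 nodes at this throughput:
print(nodes_needed(14_400, 12))  # 10
```

The business case follows directly: instead of buying a render farm sized for the largest possible job, the fleet exists only for the hours a job actually runs.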
What is the overall lesson? Nice Shoes and other creative companies with massive computing needs have shown how elastic storage, multi-carrier arrangements, secure transmissions and private networks can help IT fuel a company's creative work. Many of the same principles apply to any business, especially in today's tight economy and fast-changing technical landscape.
John Brandon is a former IT manager at a Fortune 100 company who now writes about technology. He has written more than 2,500 articles in the past 10 years. You can follow him on Twitter @jmbrandonbb.