As moviegoers watch the Incredible Hulk take on his nemesis Abomination in a New York City rooftop battle in "The Incredible Hulk," few, if any, will be thinking about the storage power needed to create the computer-generated scenes.
In taking on its largest movie project yet, Soho VFX, a Toronto-based visual effects creator, knew it would need more capacity than on earlier projects. Berj Bannayan, Soho co-founder and software engineer, said the decision to go with a BlueArc Titan 2000 came down to reliability and speed.
Computer-generated imaging (CGI) files are massive to store and retrieve. Quickly pulling and storing images for artistic creation is essential to smooth scene development and video editing.
"There was a certain complexity in that Hulk sequence on the rooftop, and just a huge amount of data was needed. We needed something to quickly move those digital assets from storage to rendering," said Bannayan.
Film generation and video production are two industries grappling with unique storage challenges, as their data files are huge. BlueArc is just one of several storage vendors pushing big boxes and new storage technologies to try to ease the pain points tied to massive data piles.
On "The Chronicles of Narnia," a previous film project, Soho's data storage topped out at about 16 terabytes. The Hulk project would involve double the number of scene shots and image files of "Narnia," roughly 150 shots in all.
The Hulk project required what Bannayan called a "constant flood of data" as well as 24-hour rendering work at the end of production. Creating sequences required simultaneous split-second demands for hundreds of gigabytes of data-intensive image files as well as intermediate files created during 3D animation. It involved loading 700 gigabytes of color-related data repeatedly in any given week.
"The time in pulling images from stored archives plays a critical role in editing, as it can make the process more efficient and faster," explained Bannayan.
The BlueArc Titan 2000 series server, which anchors the visual effects studio's production technology infrastructure, expanded the studio's disk capacity from 16 to 38 terabytes of Fibre Channel storage.
Titan supports five gigabits per second of throughput at the entry level, or 10 Gbps in a high-performance configuration. The object-based file system scales to two petabytes and supports virtual volume containers for managing data capacity, which can expand to meet allocation needs.
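To put those figures in perspective, a rough back-of-envelope calculation shows what the rated throughput means for the 700-gigabyte weekly data loads described above. This is an illustrative sketch only: it assumes the link sustains its full rated throughput with no protocol or filesystem overhead, which real workloads rarely achieve.

```python
# Illustrative transfer-time estimate (not a vendor benchmark).
# Assumes sustained throughput at the rated figure, no overhead.

def transfer_time_seconds(size_gb: float, throughput_gbps: float) -> float:
    """Seconds to move `size_gb` gigabytes at `throughput_gbps` gigabits/s."""
    return (size_gb * 8) / throughput_gbps  # 8 bits per byte

if __name__ == "__main__":
    for gbps in (5, 10):
        secs = transfer_time_seconds(700, gbps)
        print(f"700 GB at {gbps} Gbps: {secs:.0f} s (~{secs / 60:.1f} min)")
```

Under these idealized assumptions, 700 GB moves in roughly 19 minutes at the 5 Gbps entry level and roughly 9 minutes at 10 Gbps, which illustrates why the step up in throughput matters for rendering pipelines that reload such datasets repeatedly.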
When the visual studio was founded six years ago, there were four staffers, one server, and a six-CPU storage farm. The homegrown storage approach quickly needed to expand, but that was easier said than done.
"That approach did not scale easily or well," said Bannayan, adding that today there are 100 workstations and 38 terabytes of storage in play.
The dynamics of special effects and animation technologies bring unique storage challenges, according to John Affeld, senior director of product marketing at BlueArc.
"There is a life flow to production, and peak and demand times can be acute in the rendering process," said Affeld. "The goal is reducing and eliminating wait times for digital artists."
"It's called the coffee effect, where artists have to wait for a file and so they go get coffee. With quicker retrieval, they have more time to do what they need to and the storage aspect does not interfere with the product rendering process," he said.
The Titan also offers easy management tools, which Affeld said helps reduce IT work for small companies that may not have storage administrators.
In fact, Bannayan has served as the de facto storage lead at Soho.
"We have a small system staff that handles servers and network, but we don't need someone to take care of the Titan as it's self-maintaining, which helps make us more efficient," said Bannayan.
Article courtesy of Internet News