If you’ve worked in storage for decades as I have, you’ve heard all the debates about which storage works best for each step in media workflows. But one thing is clear: not every step has the same storage requirements, and some kind of tiered storage strategy is needed. With ever-expanding digital asset libraries, storing everything on high-performance disk isn’t practical or cost-effective.

Traditional tiered storage is straightforward: store the most active, most recently used data on the fastest, most expensive disk storage, and store the less active, older data on slower, less expensive storage, generally tape or lower-cost disk arrays. Hierarchical storage management (HSM) software was built to automate data migration between tiers and to make file system access transparent regardless of where the data is stored. When the primary storage filled to a capacity watermark, for example 90% of capacity, the HSM system would find the files that were least recently used and move them to the secondary tape tier until the disk storage had sufficient available capacity.
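
The watermark logic itself is easy to picture. Here’s a minimal sketch of the idea in Python, assuming a flat directory of files on the primary tier and a hypothetical move_to_tape() call standing in for the real archive operation:

```python
import os
import shutil

HIGH_WATERMARK = 0.90   # start migrating when the primary tier is 90% full
LOW_WATERMARK = 0.75    # stop once usage drops back below 75%

def usage_fraction(path):
    """Fraction of the filesystem capacity currently in use."""
    total, used, _free = shutil.disk_usage(path)
    return used / total

def migrate_lru_files(primary_dir, move_to_tape):
    """Move least-recently-used files to the secondary tier until primary
    usage falls below the low watermark. move_to_tape(path) is a stand-in
    for the real tape/archive call."""
    if usage_fraction(primary_dir) < HIGH_WATERMARK:
        return
    # Oldest access time first = least recently used
    candidates = sorted(
        (os.path.join(primary_dir, name) for name in os.listdir(primary_dir)),
        key=lambda p: os.stat(p).st_atime,
    )
    for path in candidates:
        if usage_fraction(primary_dir) < LOW_WATERMARK:
            break
        move_to_tape(path)  # e.g. copy to tape, then truncate or stub locally
```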

This model of tiered storage was built for business data, where the focus was containing costs. Disk storage was expensive, tape was cheap, and older business data was rarely relevant except for an occasional audit. The criterion was simply performance vs. cost.

But media workflows don’t manage business data.

Media workflows are much more complex, with requirements that vary based on how the content is used at each stage in the workflow. Different workflow users and applications have different storage requirements that go beyond performance, and older content shouldn’t necessarily be shipped off to a deep archive.

The solution isn’t to tier based only on performance vs. cost, but to align the type of storage with the user and application requirements of each workflow stage. It just happens to be most cost-effective that way, too.


Online storage for content production

High-resolution content production requires low-latency, high-performance storage that can stream HD, 4K or greater content to multiple workstations without dropping frames. Approximately 700 MB/sec of read or write performance is needed per user or application to stream files at 2K resolution or above. These levels of performance require not just low-latency storage, but also a low-latency protocol designed for streaming, such as Fibre Channel or InfiniBand. The real-time workflow operations with these requirements include edit, color correction, EFX, audio sweetening, live ingest, and finishing. Recommended online storage includes SSDs and high-speed disk storage, with hybrid solutions leading the way.
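
To put that number in perspective, uncompressed per-stream bandwidth is simply resolution × bit depth × frame rate. A quick back-of-the-envelope calculation (exact figures vary with bit depth, chroma sampling and frame rate, and a single editor often plays several streams at once, which is what pushes per-user demand into the hundreds of MB/sec):

```python
def stream_mb_per_sec(width, height, bits_per_pixel, fps):
    """Uncompressed bandwidth for a single video stream, in MB/sec."""
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * fps / 1e6

# 2K frames (2048x1080) at 10-bit RGB (30 bits/pixel), 24 fps
print(round(stream_mb_per_sec(2048, 1080, 30, 24)))   # ~199 MB/sec per stream
# UHD 4K frames (3840x2160) at 10-bit RGB, 30 fps
print(round(stream_mb_per_sec(3840, 2160, 30, 30)))   # ~933 MB/sec per stream
```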

Extended online for monetization

The value of content doesn’t end after initial delivery. A wide variety of new delivery channels and platforms and supplemental content—3rd screen, over-the-top viewing, behind the scenes, alternate endings and other special features—allow content to be monetized indefinitely. For the workflow, that means far more transcoding and delivery than ever before. Transcoding and delivery servers should be connected to storage that can deliver 70-110 MB/sec with high IOPS performance for much smaller files, often only 4-8K in size. These workflow operations are best suited to storage over IP connections. Object storage is recommended for extended online, particularly solutions that are tightly integrated with online storage.
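
Because extended online sits behind IP connections, transcode and delivery servers typically reach it through an HTTP object API such as S3. As a rough illustration, here’s a minimal sketch assuming an S3-compatible object store and the boto3 library, with the endpoint, credentials and bucket names as placeholders:

```python
import boto3

# Endpoint, credentials and bucket name are illustrative placeholders
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

BUCKET = "extended-online"

# Push a freshly transcoded rendition to extended online storage
s3.upload_file("promo_1080p.mp4", BUCKET, "renditions/promo_1080p.mp4")

# Pull it back later for a delivery job
s3.download_file(BUCKET, "renditions/promo_1080p.mp4", "/tmp/promo_1080p.mp4")
```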

Long-term archive for protection and preservation

Whether legacy content is being actively monetized or not, it must be preserved and protected from loss. As digital asset libraries grow, scalability, durability and economy of storage are paramount. At the same time, asset libraries must be searchable and readily accessible through media asset management. If content can’t be easily located and accessed, it can’t live up to its potential. And for disaster protection, this storage should support some level of geographic dispersion, either through copies, replication or more sophisticated methods. Buildings can burn or flood. If content isn’t properly protected it can be lost forever. Recommended storage for long-term archive includes object storage and LTO/LTFS-based tape libraries.
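
One simple form of geographic dispersion is keeping fixity-checked copies at two sites. Here’s a minimal sketch of that idea, with the two site paths as illustrative placeholders:

```python
import hashlib
import shutil

def sha256(path, chunk_size=1 << 20):
    """Streaming SHA-256 so large media files aren't read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def replicate_and_verify(src, dst):
    """Copy an asset to a second site and confirm both copies match."""
    shutil.copy2(src, dst)
    if sha256(src) != sha256(dst):
        raise IOError(f"Fixity mismatch replicating {src} -> {dst}")

# Placeholder paths standing in for two geographically separate mounts
replicate_and_verify("/archive/site-a/master_0001.mxf",
                     "/archive/site-b/master_0001.mxf")
```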

Finally, aligning content and storage with the needs of particular workflow steps is not enough. No workflow storage solution with multiple storage types is complete without automation and strong connections between those storage types. Like the old HSM systems, you still need automation to migrate and manage this new mixed workflow storage. The difference now is that the policies and connections need to be more sophisticated to meet the greater demands of today’s complex workflows.
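
Conceptually, that means moving from a single capacity watermark to placement rules keyed on the workflow stage itself. A purely illustrative sketch of such rules (the stage names and targets below are examples, not actual StorNext policy syntax):

```python
# Illustrative stage-aware placement rules, not real StorNext policy syntax
PLACEMENT_RULES = {
    "production":   {"tier": "online",          "media": ["SSD", "high-speed disk"]},
    "monetization": {"tier": "extended online", "media": ["object storage"]},
    "archive":      {"tier": "long-term",       "media": ["object storage", "LTO/LTFS tape"]},
}

def storage_for(stage):
    """Return the storage target for a given workflow stage."""
    rule = PLACEMENT_RULES.get(stage)
    if rule is None:
        raise ValueError(f"Unknown workflow stage: {stage}")
    return rule

print(storage_for("monetization"))
```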

At Quantum, we’ve moved beyond traditional tiered approaches to deliver workflow storage that’s intelligent and aware of the needs of the distinct workflow stages. The StorNext 5 platform lets you align your storage, whether it’s SSDs, disk arrays, object storage or LTO tape, to meet the needs of the distinct steps in your media workflow, all while managing it intelligently as a single infrastructure.

To learn more about how StorNext 5 goes beyond tiering, attend Alex Grossman’s keynote address at the Creative Storage Conference in Culver City, California on June 30. Or follow along as I live-tweet his talk and others on Twitter at @janetlafleur511.

Ready to Learn More?

Red Bull Media House needed to choose a new workflow management system, and they went with Quantum StorNext® software. The result? Easier collaboration and shorter content production time. Check out their Customer Story here.
