Video editing has always placed higher demands on storage than virtually any other file-based application, and today's higher-resolution formats demand even more performance from storage systems: a single stream of 4K raw requires 1210 MB/sec, 7.3 times the throughput of raw HD. In the early days of non-linear editing, this level of performance could only be achieved with direct-attached storage (DAS). As technology progressed, we were able to add shared collaboration even with many HD streams. Unfortunately, with the extreme demands of 4K and beyond, many workflows are resorting to DAS again, despite its drawbacks. With DAS, sharing large media files between editors and moving content through the workflow means copying the files across the network or onto removable media such as individual USB- and Thunderbolt-attached hard drives. That's not only expensive because it duplicates the storage capacity required; it also diminishes user productivity and can break version-control protocols. In this blog, we'll look at the key differences between the major storage technologies, as well as general usage recommendations.
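Those bandwidth figures fall straight out of frame geometry, and a quick back-of-the-envelope helper makes the HD-to-4K jump concrete. This is a minimal sketch: the resolutions, chroma sampling, bit depths, and frame rates below are illustrative assumptions, and real raw formats (with blanking, container overhead, or different sampling) will land somewhat off the exact 1210 MB/sec figure quoted above.

```python
def stream_mb_per_sec(width, height, samples_per_pixel, bit_depth, fps):
    """Uncompressed video bandwidth in MB/sec (decimal megabytes).

    samples_per_pixel: 3 for 4:4:4 RGB, 2 for 4:2:2 YCbCr (on average).
    """
    bits_per_frame = width * height * samples_per_pixel * bit_depth
    return bits_per_frame * fps / 8 / 1e6

# Illustrative (assumed) parameters, not any one camera's exact format:
hd_422_10bit = stream_mb_per_sec(1920, 1080, 2, 10, 30)   # ~155.5 MB/s
dci4k_444_12bit = stream_mb_per_sec(4096, 2160, 3, 12, 30)  # ~1194 MB/s
```

Even under these conservative assumptions, the 4K stream needs roughly 7–8x the sustained throughput of the HD stream, which is why a single spindle group that comfortably served HD editors falls over at 4K.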
Media content consists of both essence (the content itself) and its associated metadata. Everybody acknowledges that metadata is important for classifying and locating content, so media companies tend to put a lot of thought into collecting and managing it: what type of information will be collected, where it will be entered, how often, and so on. The idea is to ensure consistent, thorough metadata collection so that users can find and remonetize specific pieces of content. Metadata-gathering is a critical part of the metadata management process, to be sure, but it's only half the process. What people tend to ignore is the other piece of metadata management: ensuring that the metadata is secure and archived. Why do they ignore it? Because media companies tend to focus so much on securing the actual content that they put little if any thought into securing the associated metadata, which is often stored in a database separate from the content itself. The key best practice for protecting your metadata is to ensure that, while you're backing up your content, you're also backing up and archiving your metadata database.
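In practice that means the metadata database gets its own step in the nightly backup job, not just the essence. As a hedged sketch, assuming the asset metadata lives in a SQLite file (many MAM systems use other databases, which have their own dump tools), Python's built-in online backup API can snapshot it consistently even while the catalog is in use:

```python
import sqlite3

def backup_metadata_db(src_path, dest_path):
    """Snapshot a (hypothetical) SQLite asset-metadata database alongside
    the content backup, using SQLite's online backup API so the copy is
    consistent even if the catalog is being written to."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    with dest:
        src.backup(dest)  # online backup API, Python 3.7+
    dest.close()
    src.close()
```

The resulting snapshot file can then be archived to the same secondary tier as the content itself, so a restore brings back both the essence and the ability to find it.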
If you were building a police department from the ground up, where would you begin? Where do you go to stock up on holsters, handcuffs, badges, flashlights, guns, dispatch centers, in-car computers, police cars and the myriad other gear required by modern law enforcement? A good place to start is the Police Security Expo, held this year in Atlantic City. It’s like a superstore for police. And Quantum was there, because police departments increasingly need to include storage on their shopping list. This was my chance to check out the latest on-body cameras that have been in the news so much lately. Think of something about the size of a GoPro, but even more heavy duty, and sophisticated enough to actually begin recording 10 seconds before you press Record. The vendors selling these cameras typically had a good crowd of officers being educated on what it’s like to live with them on a daily basis, and the question of storage always came up. It’s a good thing, because law enforcement agencies are routinely generating over 1PB of data a year.
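That petabyte-a-year figure is easy to sanity-check with simple arithmetic. The numbers below are illustrative assumptions for a large agency (fleet size, recording hours, and bitrate are all hypothetical), not data from any specific department:

```python
def fleet_petabytes_per_year(cameras, hours_per_day, duty_days, bitrate_mbps):
    """Rough yearly body-camera footage volume, in decimal petabytes."""
    bytes_per_hour = bitrate_mbps * 1e6 / 8 * 3600  # Mbps -> bytes/hour
    total_bytes = cameras * hours_per_day * duty_days * bytes_per_hour
    return total_bytes / 1e15

# Assumed: 1,000 cameras recording 3 hours per shift, 250 duty days
# per year, at roughly 4 Mbps of compressed HD video.
volume = fleet_petabytes_per_year(1000, 3, 250, 4)  # ≈ 1.35 PB/year
```

Even with modest assumptions, a metro-sized fleet crosses the petabyte line in a single year, and retention mandates mean that footage can't simply be deleted when the year is over.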
The first time I edited any media, I did it with a razor and some sticky tape. It wasn’t a complicated edit – I was stitching together audio recordings of two movements of a Mozart piano concerto. It also wasn’t that long ago, and I confess that on every subsequent occasion I used a DAW (Digital Audio Workstation). I’m guessing that there aren’t many (or possibly any) readers of this blog who remember splicing video tape together (that died off with helical-scan), but there are probably a fair few who have, in the past, performed a linear edit with two or more tape machines and a switcher. Today, however, most media operations (even down to media consumption) are non-linear; this presents some interesting challenges when storing, and possibly more importantly, recalling media. To understand why this is so challenging, we first need to think about the elements of the media itself and then the way in which these elements are accessed.
If you’ve worked in storage for decades as I have, you’ve heard all the debates about which storage works best for each step in media workflows. But one thing that’s clear is that not every step has the same storage requirements, and that some kind of tiered storage strategy is needed. With ever-expanding digital asset libraries, storing it all on high-performance disk isn’t practical or cost-effective. Traditional tiered storage is straightforward: store the most active, most recently used data on the fastest, most expensive disk storage, and store the less active, older data on slower, less expensive storage, generally tape or lower-cost disk arrays. Hierarchical storage management (HSM) software was built to automate the data migration between tiers and make file system access transparent regardless of where the data is stored. When the primary storage filled to a capacity watermark, for example 90% of capacity, the HSM system would find the files that were least recently used and move them to the secondary tape tier until the disk storage had sufficient available capacity. This model of tiered storage was built for business data, where the focus was containing costs. Disk storage was expensive, tape was cheap, and older business data was rarely relevant except for an occasional audit. The criterion was simply performance vs. cost. But media workflows don’t manage business data. Here are the 3 biggest considerations for developing a new approach to workflow storage.
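The watermark-driven migration described above can be sketched in a few lines. This is a simplified model, not any vendor's implementation: the low watermark of 70% and the in-memory tier representation are assumptions for illustration, and a real HSM also leaves a stub in the file system so recalls stay transparent.

```python
HIGH_WATERMARK = 0.90  # start migrating when primary hits 90% full
LOW_WATERMARK = 0.70   # stop once usage drops below 70% (assumed value)

def migrate(primary, secondary, capacity):
    """Move least-recently-used files from the primary to the secondary
    tier until primary usage falls below LOW_WATERMARK.

    primary/secondary: dict of name -> (size_bytes, last_access_time).
    Returns the list of file names migrated, oldest access first.
    """
    used = sum(size for size, _ in primary.values())
    if used / capacity < HIGH_WATERMARK:
        return []  # below the trigger watermark; nothing to do
    # Candidates ordered by last access time, least recently used first
    candidates = sorted(primary.items(), key=lambda kv: kv[1][1])
    moved = []
    for name, (size, atime) in candidates:
        if used / capacity < LOW_WATERMARK:
            break
        secondary[name] = primary.pop(name)  # relocate to secondary tier
        used -= size
        moved.append(name)
    return moved
```

The sketch also shows why this model breaks down for media: "least recently used" is a fine eviction policy for cold business records, but a rarely touched archive clip may be the very asset a producer needs back on fast disk tomorrow.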
Video production is entering yet another major transition – the move to 4K. Much like the move to high definition (HD) several years ago, the new ultra-high definition (UHD) 4K-resolution formats have the potential to disrupt workflows, strain existing infrastructure and require costly unplanned upgrades. Those who remember how bumpy the change from SD to HD was are understandably nervous about what this looming 4K transition will bring. With lessons learned from the past, the industry is ready to make the change from HD to 4K. The technology has evolved, the tools have evolved and workflows have evolved. The challenge, however, is to make sense of all this change and put the right pieces together to enable a successful transition. The following five key tips will help you to make a smooth transition to full 4K production.
Remember those heady days of standing up your first SAN? In those days SANs were small, and likely built up with 2Gb Fibre Channel and 250GB hard drives. We found a way to make those small SANs work because we were likely ingesting from camera tape systems – and writing finished project files back to tape as well. It was chaotic – but it worked – and we evolved ever more elaborate file and folder structures to keep track of projects, customers and assets – and a growing shelf of tapes that we hoped were cataloged and tracked correctly. As simple file-based workflows gave way to the modern, content-centric workflow model, several key lessons emerged. Here are the biggest lessons and how to understand them so you can "evolve beyond the ad hoc SAN."
I’m on my way to Washington, D.C. for the GEOINT 2015 Symposium. I’m looking forward to spending time with customers and GEOINT’ers, along with the rest of the Quantum team. If you’re at the event, drop by Quantum booth #8058 to learn about the latest techniques in geospatial data management. I’m especially excited to attend GEOINT this year because I’ve been selected to give a Lightning Talk at the GEOINT Foreword Pre-Symposium. I will have 5 short minutes to tell a technology story and keep the audience engaged. No pressure. The title of my talk is: “Connecting Dots In Today’s World—Crayons Not Included.” Why focus on “connecting dots”? Well, the goal in GEOINT today is the same as ever: to derive useful intelligence from data. And while innovation is everywhere—in small sats, in 4K video, in sensors, in mobile, and in the analytics and data viz software that enables discovery—all this innovation is creating a ton of data that needs to be managed, analyzed, and connected. So it makes sense to talk about the challenges people face connecting all of these dots—and the bigger challenge they face recognizing which patterns of connected dots are meaningful.
Yesterday was an exciting day for Quantum. We had the honor of ringing the Closing Bell® on the New York Stock Exchange to commemorate the 35th anniversary of the company’s founding. It was a great feeling to represent the entire Quantum team as we marked this significant milestone and looked forward to building on our rich heritage. The company has changed significantly over the years, but here are 4 areas that have remained constant.
Quantum is ringing the closing bell at the New York Stock Exchange today in celebration of its 35th anniversary this year, so I’ve been thinking recently about the history of the storage business. It’s been a very interesting ride. Most extraordinary is the degree to which the role of data storage has changed. Data storage has moved from the edge of the data processing universe to being firmly at the core. Data storage has gone from being a peripheral playing a supporting role to being the belle of the IT ball. It’s an exciting time (again) to be in the data storage industry, particularly for Quantum because of the central role our StorNext scale-out storage solutions have long played in enabling customers to organize, protect and manage their data so they can leverage it for strategic advantage. As Quantum commemorates its 35th anniversary, we look forward to helping more organizations maximize the value of their data.
What industry has the most demanding workflow? There are some good contenders, and Sports Broadcasting is among the heavyweights. Capturing the drama and excitement of live sports has become the ultimate high-wire act in modern television production. Consider the pressures of covering a live event with no second takes, millions of highly discriminating and knowledgeable customers scrutinizing your every move, and that every play has the potential to make history. Take a look at the "Evolution of Sports Broadcasting" and see for yourself where the future of the industry is headed.
It’s so exciting to see today’s announcement that Quantum will be taking on a larger role in selling Dot Hill storage. This is not just another storage channel deal – let me explain why. The partnership of Quantum and Dot Hill began in workflow storage. Quantum enjoys a unique position in the market in serving the needs of the most demanding workflow storage applications – from video production to surveillance, cybersecurity and research. These applications have storage needs which are quite different from traditional corporate applications. They need different solutions. Both companies have noticed that there are adjacent markets and customer needs where tighter collaboration between us—leveraging the technology of Dot Hill and the technology and channel reach of Quantum—can deliver broader value to our customers and partners. Which brings me to the announcement today: Quantum is becoming the branded face of Dot Hill storage.
Anybody who has anything has something worth stealing. Today’s advanced cybersecurity threats are putting CISOs on the hot seat. And while detection and prevention remain the staples of security, effective incident response has become critical to the bottom line. When (not if) you are breached—how will you investigate, and how will you respond? This post explores 7 important questions that every Chief Information Security Officer must be able to answer about incident response.