One of the challenges an administrator at a content company faces when beginning to implement a MAM-based workflow with a tiered storage solution is deciding which system should lead. It’s almost like new dance partners stumbling over each other’s feet. Even in well-integrated solutions, where the MAM vendor has coded to the archive vendor’s APIs, there is room for conflict. Does one rely solely on the MAM software itself to drive the archival and retrieval of content to the repository? Or does one complement this with policies in the storage archive software to automate archival? Folks in the Hollywood area have an opportunity to learn more about MAM, archive and workflow storage in a live demo event at MelroseMac’s offices on June 9 with BlackMagic Cameras, Cantemo and StorNext. Read on and RSVP for the event.
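To make the second option concrete, here is a minimal sketch of what a storage-side archive policy might look like. It is purely illustrative: the paths, thresholds and selection rules are hypothetical, not actual StorNext Storage Manager or MAM vendor behavior, and a real deployment would let the archive software stub and retrieve the files rather than simply list them.

```python
# Hypothetical storage-side archive policy sketch (not vendor code).
# Selects files old and large enough for policy-driven archival, while the MAM
# remains the system of record for explicit archive and restore requests.
import time
from pathlib import Path

ARCHIVE_AFTER_DAYS = 90          # illustrative age threshold
MIN_SIZE_BYTES = 50 * 1024 ** 2  # skip small project files the MAM manages directly
WATCH_ROOT = Path("/stornext/projects")  # hypothetical mount point

def archive_candidates(root: Path):
    """Yield files that meet the illustrative age and size policy."""
    if not root.is_dir():
        return
    cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86400
    for path in root.rglob("*"):
        if path.is_file():
            stat = path.stat()
            if stat.st_mtime < cutoff and stat.st_size >= MIN_SIZE_BYTES:
                yield path

if __name__ == "__main__":
    # In practice the archive software would migrate these files and leave stubs;
    # this sketch only reports what the policy would select.
    for candidate in archive_candidates(WATCH_ROOT):
        print(candidate)
```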
There has never been a bigger rush than today to transcode and deliver content worldwide across more distribution channels. A broad range of new delivery platforms and new audiences can bring new value to legacy content, but only if your workflow supports it. If you’re not ready to release content quickly when a new distribution opportunity arises, re-process it for special features, or even re-use it in a new project, you’re leaving money on the table. And that’s not so easy to explain to your boss or your investors. Unfortunately, most workflows are poorly set up to access, transcode and deliver content created years ago. The good news is that StorNext 5 workflows built with Quantum Lattus have specific capabilities that enable real-time and non-real-time operations to occur efficiently in the same storage infrastructure. Here's how.
As you may have heard already, there’s exciting news today in the object storage marketplace: Western Digital Corp., a leader in storage technology, announced that its HGST subsidiary is acquiring Amplidata, Quantum’s object storage technology partner. We’re happy for Amplidata and looking forward to expanded partnership opportunities with WDC and the HGST group. As a reminder, Quantum announced in 2012 that we were leveraging the performance and availability of Amplidata’s object storage technology by embedding it in our Lattus family of unique active archive solutions. Since that time, many Quantum customers have been able to increase the value of their data by extending cost-effective online access to massive volumes (PBs) of information, so let's look at three major reasons why this announcement is great news.
As a marketing professional for over 20 years, I’ve seen many trends come and go. But one thing that hasn’t changed is the overwhelming sense of accomplishment you feel when your marketing message resonates with customers – when you hear them play back your value proposition in their own words with tangible examples of how you’re enabling business growth. It’s a victory that says we listened, we heard correctly, and we nailed it. This is especially true when you’re carving a new direction, such as launching a new offering that challenges the legacy way of doing business. I recently had one of those experiences when a group of Quantum folks had dinner with one of our customers, the fastest growing sports organization in the world.
Capturing the drama and excitement of live sports has become the ultimate high-wire act in modern television production. Consider the pressures of covering a live event with no second takes, millions of highly discriminating and knowledgeable customers scrutinizing your every move, and that every play has the potential to make history. Yet this competitive pressure has evolved the modern sports production workflow into a marvel of efficiency, where creative storytellers wrap the drama of the moment with compelling storylines, insights and statistics that deepen our appreciation of the game, and can dip into an ocean of past games and content to round out the storylines of the moment. The challenge of course is this: How can you manage the crush of new content, make sure it’s tightly integrated with your existing asset management and production automation, and scale to meet your projected growth without disrupting your operations?
Content production has never been a simple process, but the number of moving parts and the scale involved have grown to global proportions. Even a low-budget film might shoot in the rainforest in Costa Rica, edit in Vancouver, add visual effects in Korea, color-correct in Toronto, and finish in Hollywood. At the same time, there’s more pressure to transcode and deliver content worldwide on more platforms than ever before. All of this needs to happen without the added complexity of making and transmitting duplicate copies between remote teams. That’s where cutting-edge storage technologies and workflow automation tools head to the cloud with StorNext in the Cloud. StorNext in the Cloud lets you use the same workflows you use today, but now you and your team can work remotely, sharing content stored on Quantum’s Lattus object storage with all the scalability, flexibility and security you need, automatically managed by StorNext Storage Manager. With a StorNext and Lattus cloud-based workflow there’s no need to integrate your workflow into a cloud that’s not built for media. Instead, you can keep using the tools from the broad ecosystem of StorNext solution partners.
I know a lot of folks think the big contest this time of year is the Super Bowl playoffs. In Quantum’s Denver, Bay Area and Seattle offices we’re sporting the colors of the Broncos, 49ers and Seahawks, with just a bit of friendly trash talk to kick off conference calls. Perhaps you know someone rooting for New England – I don’t. But if you care about data storage, the other big contest is Storage Magazine’s Product of the Year Awards. The award serves as an annual reminder of what the storage community found important, promising, and profitable. This year’s award finalists include a cross-section of Quantum products spanning scale-out shared storage and the data center, highlighting the breadth of innovation from the company over the last year. For 2013, four Quantum products – more than any other vendor among the finalists – have been selected in three award categories.
Astronomers searching for life outside of our solar system speak of The Goldilocks Zone – the region around a star where conditions are suitable for sustaining life: not too close and hot, and not too distant and cold. Initially these “just right” conditions appeared to be almost impossibly rare, but researchers over the years have found organisms that can exist in more conditions than previously imagined. It turns out that the Goldilocks Zone is wider than we thought, increasing the possibility of finding other planets capable of sustaining life. Today a similar recognition is happening in data centers. While IT has long thought of data storage as “hot” and requiring immediate access in flash memory or primary disk, or “cold” and suitable for backup and archive to tape, there weren’t many choices for a “warm” tier of data that required a more nuanced cost/latency balance. The expanding range of choices such as public and private cloud, object storage and LTFS tape has in effect created a wider Goldilocks Zone for data centers. The refreshed thinking about the capabilities of both established and emerging technologies for these different tiers of storage has been getting a lot of attention lately.
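As a rough illustration of that widened Goldilocks Zone, a placement rule might weigh access recency against latency tolerance when choosing a tier. The sketch below is purely illustrative; the tier names, thresholds and example assets are hypothetical, not recommendations for any particular product.

```python
# Illustrative tier-placement sketch; thresholds are examples, not recommendations.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    days_since_last_access: int
    needs_millisecond_latency: bool

def place(asset: Asset) -> str:
    """Map an asset to a hot, warm or cold tier by latency need and access recency."""
    if asset.needs_millisecond_latency or asset.days_since_last_access <= 7:
        return "hot: flash or primary disk"
    if asset.days_since_last_access <= 365:
        return "warm: object storage, cloud or LTFS tape kept online"
    return "cold: tape for backup and deep archive"

if __name__ == "__main__":
    for a in (Asset("live-edit.mov", 1, True),
              Asset("last-season-game.mxf", 120, False),
              Asset("2005-raw-footage.tar", 2000, False)):
        print(f"{a.name} -> {place(a)}")
```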
The creation and acquisition of massive amounts of content has become easier than ever. With the introduction of new digital acquisition technologies (from video cameras to sensors) and increasingly sophisticated data analysis tools, the way we handle and save our data is changing. The true value of information evolves over time. For example, combining real-time data with historical data can reveal unexpected results. Old video footage can be compiled and digitized from archives to bring back a moment that seemed insignificant at the time. For businesses that rely on data to identify trends or repurpose content for monetization, there is a need to keep all of this data forever.
Being the “Cloud Guy” at Quantum, I get to talk to a wide variety of people about what’s happening in the cloud, from the wildly optimistic visionaries to the skeptics wondering, “Is my data really safe?” This week the visionaries got a hard reality check when Nirvanix abruptly announced plans to shut down their cloud service, giving customers and partners just two weeks to find another place for their petabytes of data. The cloud still offers enormous benefits, but I think the Nirvanix example is a great reminder that not all clouds are created equal and there are key considerations companies need to thoroughly evaluate.
After publishing my blog yesterday on the need for application support of object storage to break the logjam in adoption, it occurred to me that some of you may be asking the question: “Janae, if object storage really is so cool, and the gap in adoption comes down to data mover application providers writing to this new technology, why haven’t these developers quickly moved to fill this gap?”
There seems to be wide agreement across the industry that object storage has the potential to provide major value to customers, particularly as customer data scales to reach petabytes of valuable – often distributed – content across a wide range of customer environments. So there was an interesting discussion last week at the Next Generation Object Storage Summit about what’s inhibiting the adoption of object storage across the industry. After a day and a half of (sometimes quite lively) discussion between analysts and industry participants, the top three inhibitors were summarized as: (1) general market awareness; (2) customer education about where the technology fits; and finally, (3) the availability of ‘on ramps’ to the technology, namely applications that will write to it.
The convergence of backup and archive is a really hot topic right now. Quantum, along with some of our partners in the industry, is introducing capabilities that bring backup and archive closer together. An extreme view on this topic is that batch backup is a thing of the past, and for some data types and use cases there is some truth to that. But to unpack this a bit we need to look at use cases and even specific data types.
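To ground the distinction before digging into use cases, here is a minimal, purely illustrative sketch (nothing Quantum-specific, and the stub format is hypothetical): backup copies data for recovery and leaves the original in place, while archive moves data for long-term retention and leaves a lightweight reference behind.

```python
# Illustrative backup vs. archive semantics; paths and stub format are hypothetical.
import shutil
from pathlib import Path

def backup(source: Path, backup_dir: Path) -> Path:
    """Backup: copy for recovery; the original stays where applications expect it."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(source, backup_dir / source.name))

def archive(source: Path, archive_dir: Path) -> Path:
    """Archive: move for long-term retention, leaving a small stub behind."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    target = archive_dir / source.name
    shutil.move(str(source), str(target))
    source.write_text(f"archived to {target}\n")  # stand-in for a real stub or pointer
    return target
```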