Got content? Then you need storage. Like the GEICO commercial says, “Everybody knows that.” But does everyone know what type of storage they really need? Do they know which questions to ask to avoid a content workflow disaster? Get the answers to the right questions at the Creative Storage Conference.
Content production has never been a simple process, but the number of moving parts and the scale involved have grown to global proportions. Even a low-budget film might shoot in the rainforest in Costa Rica, edit in Vancouver, add visual effects in Korea, color-correct in Toronto, and finish in Hollywood. At the same time, there’s more pressure to transcode and deliver content worldwide on more platforms than ever before. Now imagine doing all of this without the added complexity of making and transmitting duplicate copies between remote teams. That’s where cutting-edge storage technologies and workflow automation tools meet the cloud: StorNext in the Cloud. StorNext in the Cloud lets you keep the same workflows you use today, but now you and your team can work remotely, sharing content stored on Quantum’s Lattus object storage with all the scalability, flexibility, and security you need, automatically managed by StorNext Storage Manager. With a StorNext and Lattus cloud-based workflow, there’s no need to shoehorn your workflow into a cloud that’s not built for media. Instead, you can keep using the tools from the broad ecosystem of StorNext solution partners.
In Newtonian mechanics, momentum has a direction as well as a magnitude. If Newton was correct, and I am going to go out on a limb here and assume that to be the case, then the Powered by Quantum MSP Program has momentum, with a positive direction and a high magnitude. Over the past couple of weeks, Quantum has successfully created partnerships with a number of MSPs that deliver their own cloud backup services powered by Quantum technology. Just this month we added two new MSPs to the roster: Elanity Network Partner and Interconnekt. These partners, scattered across the globe, have recognized the benefits that Quantum solutions can bring not only to their customers but also to their bottom line.
This article originally appeared on Wired Magazine’s Innovation Insights. With the start of the new year, it’s time once again for those of us in enterprise storage to look ahead and offer our predictions for what the industry will see in 2014. So without further ado, here are ten trends that will have a big impact in the coming year.
I know a lot of folks think the big contest this time of year is the NFL playoffs. In Quantum’s Denver, Bay Area and Seattle offices we’re sporting the colors of the Broncos, 49ers and Seahawks, with just a bit of friendly trash talk to kick off conference calls. Perhaps you know someone rooting for New England – I don’t. But if you care about data storage, the other big contest is Storage Magazine’s Product of the Year Awards. The awards serve as an annual reminder of what the storage community found important, promising, and profitable. This year’s finalists include a cross-section of Quantum products spanning scale-out shared storage and the data center, highlighting the breadth of the company’s innovation over the last year. For 2013, four Quantum products – more than from any other vendor among the finalists – have been selected across three award categories.
Since the introduction of the cloud there has been a lot of talk, and more than a few jokes, about how to get started in the cloud. We see customers all the time trying to figure out what they can do from a cloud strategy perspective and how it will impact their current infrastructure, positively or negatively, mainly around budgets. The cloud certainly has the ability to provide some financial relief – allowing you and your team to focus on more strategic projects – so why not get started with cloud technologies, particularly when it comes to backup? Quantum recently announced a cloud-based backup program for MSPs and VARs that delivers a number of fantastic benefits. Read on to learn more.
As the volume of data has increased, there has been a shift in the way that companies use and access that data. That means it’s time to change the way you think about data protection, retention, and accessibility. Organizations of all sizes recognize that data can help gain competitive advantages and even support new revenue streams, but this is placing a demand on IT to store and preserve access to that data. Companies need new solutions and technologies to support unpredictable, on-demand access and incorporate new approaches to backup and archiving. It’s time to reTHINK Backup & Archive.
The creation and acquisition of massive amounts of content have become easier than ever. With the introduction of new digital acquisition technologies (from video cameras to sensors) and increasingly sophisticated data analysis tools, the way we handle and save our data is changing. The true value of information evolves over time. For example, combining real-time data with historical data can reveal unexpected results, and old video footage can be digitized and compiled from archives to capture a previously overlooked moment in time. For businesses that rely on data to identify trends or repurpose content for monetization, there is a need to keep all of it forever.
Being the “Cloud Guy” at Quantum, I get to talk to a wide variety of people about what’s happening in the cloud, from the wildly optimistic visionaries to the skeptics wondering, “Is my data really safe?” This week the visionaries got a hard reality check when Nirvanix abruptly announced plans to shut down their cloud service, giving customers and partners just two weeks to find another place for their petabytes of data. The cloud still offers enormous benefits, but I think the Nirvanix example is a great reminder that not all clouds are created equal, and there are key considerations companies need to evaluate thoroughly.
There seems to be wide agreement across the industry that object storage has the potential to provide major value to customers, particularly as customer data scales to reach petabytes of valuable – often distributed – content across a wide range of customer environments. So there was an interesting discussion last week at the Next Generation Object Storage Summit about what’s inhibiting the adoption of object storage across the industry. After a day and a half of (sometimes quite lively) discussion between analysts and industry participants, the top three inhibitors were summarized as: (1) general market awareness; (2) customer education about where the technology fits; and finally, (3) the availability of ‘on ramps’ to the technology, namely applications that will write to it.
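What does an application “on ramp” actually have to do? At its core, object storage replaces the filesystem hierarchy with a flat namespace of whole objects addressed by key, each carrying its own metadata. The toy sketch below (not any vendor’s API, just an in-memory illustration of the model) shows the handful of operations an application would write to:

```python
# Toy illustration of the object-storage model (not any vendor's API):
# whole objects written and read by key, with per-object metadata,
# in a flat namespace instead of a filesystem hierarchy.

class ToyObjectStore:
    def __init__(self):
        self._objects = {}  # flat namespace: key -> (bytes, metadata)

    def put(self, key, data, metadata=None):
        # Objects are written whole; there is no partial in-place update.
        self._objects[key] = (bytes(data), dict(metadata or {}))

    def get(self, key):
        data, metadata = self._objects[key]
        return data, metadata

    def list_keys(self, prefix=""):
        # "Folders" are just a naming convention over key prefixes.
        return sorted(k for k in self._objects if k.startswith(prefix))


store = ToyObjectStore()
store.put("projects/film-01/reel-03.mov", b"...media bytes...",
          {"codec": "ProRes", "retention": "forever"})
data, meta = store.get("projects/film-01/reel-03.mov")
print(meta["codec"])
print(store.list_keys("projects/film-01/"))
```

The simplicity is the point: an application that can issue put/get calls like these has its on ramp, which is why the availability of such applications was flagged as the third adoption inhibitor.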
On behalf of everyone at Quantum, we would like to offer sincere congratulations and a big high five to NASCAR for winning the prestigious IBC2012 Innovation Award in the Content Management category! Did you know that NASCAR events are broadcast to 150 countries? It’s not surprising that people around the world enjoy the fast-paced sport of auto racing, especially when it’s delivered in such a compelling fashion. With 18 high def cameras trackside and more cameras on board some of the cars, NASCAR goes to great lengths to make at-home audiences feel like they are part of the action.
There was plenty of chatter last week following Amazon’s introduction of their new cloud data archival service, Glacier. No wonder. In Silicon Valley we spend a lot of time sorting through “bright shiny object” technologies, but in the end it’s frequently about value, and $.01/GB/month sounds pretty good. If you’re looking for consumer-grade storage with somewhat relaxed retrieval times and security requirements, Glacier is hard to beat for infrequently accessed data. But companies with enterprise-class backup and restore requirements don’t have hours to access their data; they have seconds. Maybe they’re looking for a disaster recovery solution that takes advantage of cloud resources. For enterprise-class customers considering cloud solutions for their data protection, one of the first questions to ask is, “How easily can we recover our data?”
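At $.01/GB/month, the back-of-the-envelope math is straightforward. A quick sketch, using only the storage rate quoted above (retrieval, request, and transfer fees, which such services also charge, are deliberately left out):

```python
# Back-of-the-envelope monthly storage cost at the $.01/GB/month rate
# quoted above. Retrieval, request, and transfer fees are left out.

RATE_PER_GB_MONTH = 0.01  # dollars per GB per month

def monthly_storage_cost(terabytes):
    gigabytes = terabytes * 1024  # binary TB -> GB, for simplicity
    return gigabytes * RATE_PER_GB_MONTH

# 50 TB of infrequently accessed archive data:
print(f"${monthly_storage_cost(50):,.2f}/month")  # $512.00/month
```

A price like that explains the chatter, and it also explains why the real differentiator becomes retrieval time rather than cost per gigabyte.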
How do you back up your virtualized infrastructure? Do you even back it up at all? Virtualization brings new challenges and opportunities for backup and disaster recovery. Traditional methods are inefficient and difficult to use with virtualized infrastructure, and cloud solutions, while interesting, raise questions about complexity, risk, and cost. What if you could deploy a simple, low-risk, software-only, Backup-as-a-Service model for your virtualized infrastructure at a compelling price?