On January 12th, we announced our new solution for offsite DR storage using Amazon’s cloud infrastructure – Q-Cloud Protect for Amazon Web Services. There’s a great article about it in eWeek here. Offering a product that runs in Amazon’s cloud infrastructure is a brand new type of offering for us, and I’ve learned a lot over the past year – and even in the past few weeks – speaking with customers, partners and analysts about this new product. I’ve talked with organizations of all industries and sizes: private and public universities, small financial services firms, multi-national distributors, government agencies and contractors, and more. Here are some takeaways...
The pressure is real. It’s hard enough to remain efficient in an industry where higher resolutions, new camera formats and a growing range of delivery options are emerging faster than ever before. Add in the complexity of sharing content across distributed teams and archiving content securely, and many traditional workflows fail to stay efficient, at a time when efficiency matters most. Tight project timelines aren’t getting any longer just because your workflow can’t keep up. It’s time for the cloud. Here's how collaboration and archive are heading to the cloud.
I recently had the pleasure of participating in the launch of Quantum’s new cloud-related services and products – Q-Cloud Archive, Q-Cloud Vault and Q-Cloud Protect for AWS. I participated by joining some panel discussions with Quantum’s Geoff Stedman, Senior Vice President, StorNext Solutions, and Bassam Tabbara, Executive Director, Cloud Services. One of the overriding themes in our panel discussion was the idea of putting data where it makes the most sense from a cost and performance perspective. If this sounds a lot like the good – well, not really – old days of hierarchical storage management (HSM), that’s because the basic concept is the same. One of the differences today is that we now have the cloud as another tier in the storage hierarchy.
Businesses of all sizes are increasingly deploying data in the cloud, driven by the promises of greater agility, lower management cost and capital savings. It just makes sense. When both compute and data move together – in lockstep – to the cloud, the issues to be considered are very similar to deploying or migrating an onsite application. But when the major compute operations stay onsite and only the data moves offsite (such as for backup, disaster recovery or compliance archive), the deployment can be more complex. In this scenario, operational executives must weigh five key issues – based on the use case being planned – to ensure a successful experience, including meeting customer service level agreements (SLAs) and staying within budget.
Data protection strategies have been in a state of accelerated evolution over the last five years. I hear this confirmed regularly by customers describing their implementation stories with Quantum, as well as by the industry analysts we meet with to discuss our latest product innovations. ESG’s Jason Buffington is one of those analysts we talk with often, and it’s always interesting to see how ESG’s research squares with what we’re seeing in data centers. Jason’s latest video blog about modernizing data protection – 8 Suggestions for Every Data Protection Strategy – highlights ESG research that resonated with me in a number of respects.
Being the “Cloud Guy” at Quantum, I get to talk to a wide variety of people about what’s happening in the cloud, from the wildly optimistic visionaries to the skeptics wondering, “Is my data really safe?” This week the visionaries got a hard reality check when Nirvanix abruptly announced plans to shut down their cloud service, giving customers and partners just two weeks to find another place for their petabytes of data. The cloud still offers enormous benefits, but I think the Nirvanix example is a great reminder that not all clouds are created equal, and there are key considerations companies need to evaluate thoroughly.
There was plenty of chatter last week following Amazon’s introduction of their new cloud data archival service, Glacier. No wonder. In Silicon Valley we spend a lot of time sorting through “bright shiny object” technologies, but in the end it’s frequently about value, and $.01/GB/month sounds pretty good. If you’re looking for consumer-grade storage with somewhat relaxed retrieval times and security requirements, Glacier’s hard to beat for infrequently accessed data. But companies with enterprise-class backup and restore requirements don’t have hours to access their data – they have seconds. Maybe they’re looking for a disaster recovery solution that takes advantage of cloud resources. For enterprise-class customers considering cloud solutions for their data protection, one of the first questions to ask is, “How easily can we recover our data?”