I recently had the pleasure of participating in the launch of Quantum’s new cloud-related services and products – Q-Cloud Archive, Q-Cloud Vault and Q-Cloud Protect for AWS. I participated by joining some panel discussions with Quantum’s Geoff Stedman, Senior Vice President, StorNext Solutions, and Bassam Tabbara, Executive Director, Cloud Services.
One of the overriding themes in our panel discussion was the idea of putting data where it makes the most sense from a cost and performance perspective. If this sounds a lot like the good (well, not really) old days of hierarchical storage management (HSM), that's because the basic concept is the same. One difference today is that the cloud is now another tier in the storage hierarchy; another is that, unlike some of those old HSM products, today's migration software actually works.
A key point here is that the cloud is an optional tier that may (or may not) make sense in the context of your performance, cost, and security objectives. If on-premises disk or tape makes more sense for you, it's fine to ignore the cloud (at least for now). But if you find the economics of the cloud cannot be ignored, then it's time to add it to your storage tiering options.
In any case, end-user study data from 451 Research suggests that although few IT organizations can ignore the economic advantages of the cloud, few are comfortable putting all of their data in the cloud. As such, we think the hybrid approach to the cloud will be the way to go for most organizations, at least for the foreseeable future. Our research also shows that, although users are welcoming hybrid cloud architectures with open arms, they want the transition to be as non-disruptive as possible, particularly for the non-IT parts of their organizations. That means the existing workflow has to remain the same. In the case of Quantum's approach to integrating the cloud into storage tiering, that's possible thanks to the StorNext file system, which lets administrators set storage tiering policies that align with their existing workflow.
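To make the tiering idea concrete, here is a minimal sketch of what a policy-driven tier decision looks like conceptually. This is purely illustrative: the tier names and age thresholds are hypothetical, and this is not actual StorNext policy syntax or behavior.

```python
from datetime import datetime, timedelta

# Hypothetical tiers and age thresholds -- illustrative only,
# not actual StorNext policy syntax or default values.
TIER_RULES = [
    ("primary-disk", timedelta(days=30)),   # recently accessed data stays on fast disk
    ("tape",         timedelta(days=365)),  # cooler data migrates to tape
    ("cloud",        None),                 # anything older goes to the cloud tier
]

def select_tier(last_accessed, now=None):
    """Pick a storage tier based on how long ago a file was last accessed."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    for tier, max_age in TIER_RULES:
        if max_age is None or age <= max_age:
            return tier
    return TIER_RULES[-1][0]
```

The point of a policy engine like this is that the rules, not the users, decide where data lives, so the workflow on top of the file system stays unchanged as data moves between tiers.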
Virtually every storage vendor is in some way incorporating the cloud into its architecture; in that sense, Quantum is no different. But what could set Quantum apart is the crown jewel, or secret sauce (pick your well-worn metaphor), behind its tiered architecture: the StorNext file system, which automates the data migration process.
This fits into our general theme of keeping it simple. End users already have enough fears about the cloud, justified or not (security, complexity, hidden costs, loss of "control"), and vendors simply can't afford to add to them.
Speaking of hidden costs: early adopters of public clouds have often run into them, such as tack-on charges for data retrieval, added costs for capacity that exceeds what they thought they would need, and so on. That's why I like Quantum's approach of masking the complexity and hidden costs often associated with public clouds behind a relatively simple usage-based service that's managed by Quantum. In its initial implementations, Quantum will use Amazon's AWS public cloud as the back-end infrastructure for stored data, although I understand the company plans to add other public cloud options in the future (a good idea because, from an end-user perspective, it's all about choice).
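A back-of-the-envelope model shows why retrieval charges surprise people. The rates below are placeholders I chose for illustration, not actual AWS or Quantum pricing, and real bills also include request charges, tiered rates, and minimum-storage-duration fees.

```python
def monthly_cloud_cost(stored_gb, retrieved_gb,
                       storage_rate=0.023, retrieval_rate=0.09):
    """Estimate a monthly public-cloud storage bill.

    storage_rate and retrieval_rate are $/GB placeholders, not real
    pricing; the point is the shape of the bill, not the numbers.
    """
    return stored_gb * storage_rate + retrieved_gb * retrieval_rate
```

With these placeholder rates, storing 10 TB costs about $230 for the month, but pulling back just 10% of it adds roughly $90 more: the retrieval line item grows with usage patterns that are hard to predict up front, which is exactly the kind of surprise a flat usage-based service masks.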
Assuming Quantum can stick to its "keep it simple" plan as it rolls out the new services, it should go a long way toward allaying some of the fears that end users have about embracing the cloud.
Want to Learn More?
For businesses that are starting to use the cloud as an extension of their own IT environment, backup and disaster recovery are great applications for the cloud and cloud-based services. Learn how to use a hybrid approach for on-premises backup and off-site cloud DR storage here.