The cloud era is here. Across many industries, organizations are eagerly implementing private cloud environments and signing up for public cloud services to take advantage of the agility, flexibility, scalability, and cost-saving benefits that this technology can deliver.
Recently, I climbed on stage to moderate a panel discussion on “infrastructure vs. cloud” at the Technology in Government conference in Canberra, Australia. My panelists ranged from first-line government IT managers to heavy hitters like Barbara Cohn, the first chief data officer of New York state.
Many companies are getting caught up in the hype of moving to the cloud, only to discover issues and costs that are not obvious at the outset. The public cloud offers many services of great value: software, storage, infrastructure, and more, and the growth of these services has triggered a rush to the cloud. However, just because we can outsource these services doesn’t mean we always should, as a recent article noted.
On January 12th, we announced our new solution for offsite DR storage using Amazon’s cloud infrastructure – Q-Cloud Protect for Amazon Web Services. There’s a great article about it in eWeek here. Offering a product that runs in Amazon’s cloud infrastructure is a brand-new type of offering for us, and I’ve learned a lot over the past year – and even in the past few weeks – speaking with customers, partners and analysts about this new product. I’ve spent time speaking with firms of every size across many industries: private and public universities, small financial services firms, multi-national distributors, government agencies and contractors, and more. Here are some takeaways...
The pressure is real. It’s hard enough to remain efficient in an industry where higher resolutions, new camera formats and a growing range of delivery options are emerging faster than ever before. Add in the complexity of sharing content across distributed teams and archiving content securely, and many traditional workflows fail to stay efficient, at a time when efficiency matters most. Tight project timelines aren’t getting any longer just because your workflow can’t keep up. It’s time for the cloud. Here's how collaboration and archive are heading to the cloud.
As you may have heard already, there’s exciting news today in the object storage marketplace: Western Digital Corp., a leader in storage technology, announced that its HGST subsidiary is acquiring Amplidata, Quantum’s object storage technology partner. We’re happy for Amplidata and looking forward to expanded partnership opportunities with WDC and the HGST group. As a reminder, Quantum announced in 2012 that we were leveraging the performance and availability of Amplidata’s object storage technology by embedding it in our Lattus family of unique active archive solutions. Since that time, many Quantum customers have been able to increase the value of their data by extending cost-effective online access to massive volumes (PBs) of information. Let’s look at three major reasons this announcement is great news.
I recently had the pleasure of participating in the launch of Quantum’s new cloud-related services and products – Q-Cloud Archive, Q-Cloud Vault and Q-Cloud Protect for AWS. I joined panel discussions with Quantum’s Geoff Stedman, Senior Vice President, StorNext Solutions, and Bassam Tabbara, Executive Director, Cloud Services. One of the overriding themes in our discussion was the idea of putting data where it makes the most sense from a cost and performance perspective. If this sounds a lot like the good (well, not really) old days of hierarchical storage management (HSM), that’s because the basic concept is the same. One of the differences today is that we now have the cloud as another tier in the storage hierarchy.
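The core HSM idea — place data on the tier that fits its activity level — can be sketched in a few lines. This is a hypothetical illustration only: the tier names and the age thresholds below are invented for the example, not taken from any Quantum product; real policies also weigh cost, retention requirements, and performance SLAs.

```python
from datetime import datetime, timedelta

# Hypothetical tiers and thresholds, for illustration only.
def choose_tier(last_access: datetime, now: datetime) -> str:
    age = now - last_access
    if age < timedelta(days=7):
        return "primary"        # fast disk/flash for active data
    if age < timedelta(days=90):
        return "nearline"       # cheaper disk or object storage
    return "cloud-archive"      # lowest-cost, highest-latency tier

now = datetime(2015, 3, 1)
print(choose_tier(now - timedelta(days=2), now))    # recently touched data
print(choose_tier(now - timedelta(days=30), now))   # cooling data
print(choose_tier(now - timedelta(days=400), now))  # cold data
```

The difference today is simply that the bottom tier can be a cloud service rather than tape in the next room.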
As ubiquitous as the cloud is today, it has taken time to get there. Even in enterprise IT, you wouldn’t characterize the adoption of cloud services as universal and all-encompassing. Functions that seem obvious and mundane, like cloud-based IT storage backup and cloud-based virtualized software development, went through periods of careful ROI analysis. So it is with cloud for media and entertainment. If you feel like you’re behind the curve in leveraging cloud in your M&E operations, you’re not alone, and it comes down to two factors: ecosystem and workflow.
Businesses of all sizes are increasingly moving data to the cloud, driven by the promises of greater agility, lower management cost and capital savings. It just makes sense. When compute and data move together – in lockstep – to the cloud, the issues to consider are very similar to deploying or migrating an onsite application. But when the major compute operations stay onsite and only the data moves offsite (such as for backup, disaster recovery or compliance archive), the deployment can be more complex. In this scenario, operational executives must weigh five key issues – based on the use case being planned – to ensure a successful experience, including meeting customer service level agreements (SLAs) and staying within budget.
If you have an aging Apple Xsan environment or need a robust asset management environment that Cantemo Portal can deliver, you can’t miss this opportunity to chat informally with folks from all three companies over a beer. Come to the Goose Island Brewery in Chicago or the Cambridge Brewing Company in Boston to hear more about how 1303 Systems, Cantemo and Quantum can help improve collaborative media workflows and extend the value of your content.
The enterprise has been at the center of IT innovation for many decades. With an emphasis on reliability, business continuity, security and ROI, the enterprise has challenged software and hardware vendors to continuously innovate to meet such high demands. Today, we are announcing three more cloud offerings: Q-Cloud Archive, Q-Cloud Vault and Q-Cloud Protect for AWS. We believe that the industry is witnessing a transformational shift in enterprise IT, and we are excited to enable customers to easily and seamlessly add cloud storage to their existing storage and collaboration workflows.
As we get ready to say goodbye to 2014, our thoughts turn to what lies ahead in 2015. If it’s anything like this year, it will be another exciting – and interesting – one for storage. With that in mind, here are some of my thoughts on what’s in store for storage in the coming year.
Since launching the DXi6900 in July, we’ve seen remarkable customer interest. That interest is one of the key drivers of the 11% year-over-year growth in DXi revenue we reported last quarter. Now another data point showing how the DXi6900 and the entire DXi family stack up against other deduplicating backup appliances has just been published: industry analyst firm DCIG issued its annual buyer’s guide, and the DXi6900 earned a “recommended” rating, with just 0.45 points separating it from the top spot. In fact, DXi appliances took three of the top six spots on the list. So why is the DXi6900 getting so much attention? And how do DXi deduplication solutions work for “real-world” customers?
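For readers new to the idea, the principle behind any deduplicating appliance is to store each unique block of data once and keep a recipe of references for everything else. The sketch below uses fixed 4 KB blocks and SHA-256 hashes purely for illustration; it is a generic simplification, not the DXi implementation (production systems typically use more sophisticated variable-length chunking).

```python
import hashlib

# Minimal fixed-block deduplication sketch; illustrative only.
def dedupe(data: bytes, block_size: int = 4096):
    store = {}   # hash -> unique block contents (stored once)
    recipe = []  # ordered list of hashes to rebuild the stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy
        recipe.append(digest)
    return store, recipe

def restore(store, recipe) -> bytes:
    return b"".join(store[h] for h in recipe)

# Three identical blocks plus one unique block:
data = b"A" * 4096 * 3 + b"B" * 4096
store, recipe = dedupe(data)
print(len(recipe), "logical blocks stored as", len(store), "unique blocks")
```

The savings come from exactly this effect at scale: backup streams are highly repetitive, so most blocks are already in the store.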