Las Vegas is about to host one of the biggest hotspots for determining the future of IT and data protection: VeeamOn 2015 is almost here. Quantum and Veeam recently cemented our Premier Partnership, and we’re excited to talk about how we’re helping customers break free from old school, traditional ways of protecting data. That’s what Kicking the Cartel is all about. Storage companies will be all over VeeamOn, competing for your attention and showing off their latest and greatest in an attempt to stay one step ahead of the rapid change in the way companies store data. And we all know the way people access data is changing, with access patterns and virtual workloads becoming less and less predictable. Customers need greater access to their data, and they don’t want to pay thousands of dollars per TB for a ‘one size fits all’ solution to protect it.
Storing data is easy (well, not that easy). But turning it into meaningful business value requires technology partners that know how to integrate well and create something bigger. Customers want products that combine into a real solution to a real problem. That’s the Quantum approach, and that’s why we’re so excited for this year’s VMworld. We’re “Kicking the Cartel” and helping companies break free from traditional, old school ways of storing and protecting data. If it’s time for your organization to stop paying $2,500 per TB for backup software, stop by the Quantum booth at VMworld. We’ll show you a different approach to storage, one that centers on the highest performance and the lowest TCO. We’ll also be showcasing all of our technology at VMworld, including our QXS hybrid storage for VM primary storage and our Artico NAS appliance for archiving. Sign up to meet with the Quantum Storage Experts at the show, and you could win an Apple Watch, too.
Finally – a day to celebrate everything about backup! We know that backups can sometimes be seen as a necessary evil, but let’s face it – backing up and protecting data is more important than ever. For consumers at a personal level and for businesses alike, digital information is more valuable than ever. And the nature of backup is changing – the idea of ‘batch backup’ that businesses have employed for years is going away, with new technologies and approaches for storing and protecting data coming out all the time. It’s an interesting opportunity to reflect on where Quantum has been in the world of backup as a leader in various technologies, and where we’re going now as a storage company.
Quantum's DXi6900 proudly received the Silver Award in Storage Magazine/SearchStorage.com's 2014 "Product of the Year" backup hardware category. This recognition comes just as industry analyst firm ESG released the results of its recent lab testing, validating the DXi6900’s performance claims. Both the award and the lab validation are a big deal for us, and they reinforce what we’ve already been hearing from our happy customers: the DXi6900 is a high-performance appliance that is very well suited to the needs of mid-size and enterprise-scale companies, as well as managed service providers.
As I noted in an earlier blog, customers planning to move data applications (e.g., backup and archive) to the cloud must consider four key factors in selecting and migrating data: ongoing data transfer volume, expected frequency of ongoing data usage, data recall performance requirements, and application integration. In the next several blogs, I’d like to illustrate the importance of these factors by showing how they impact your design and planning as you migrate a few common data use cases to the cloud. The four use cases we’ll consider are: site disaster recovery; data center off-site copy (for backup); compliance archive; and remote site primary storage, archive, and backup, along with their ongoing management. Let’s start by looking at central site disaster recovery.
Chances are, if you are having backup problems, your issue is caused by large volumes of unstructured data. Within that mass of unstructured data that is making backup difficult, you're likely to find an increasing amount of data that never changes. Industry analysts have observed that by the year 2020, a full 50% of the data passing over the network (and stored somewhere) is going to be video and images. So the question becomes: are you ready to look at the composition of your unstructured data to see what has snuck in? And to find easy-to-deploy archiving software that can migrate this data OUT of your active data pool and out of your backup process?
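As a rough illustration of that composition check, here is a minimal Python sketch that walks a file tree and totals capacity by file extension, splitting out files untouched for a year as archive candidates. The scan root and age threshold are hypothetical placeholders, and a real archiving product would apply far richer policy than this.

```python
import os
import time
from collections import defaultdict

# Hypothetical scan root and age threshold: adjust for your environment.
SCAN_ROOT = "/data/shares"
STALE_AFTER_DAYS = 365

def profile_unstructured(root: str, stale_days: int):
    """Walk a file tree and total bytes by extension, separating files
    not modified within `stale_days` (the archive candidates)."""
    cutoff = time.time() - stale_days * 86400
    active = defaultdict(int)
    stale = defaultdict(int)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip unreadable files
            ext = os.path.splitext(name)[1].lower() or "(none)"
            bucket = stale if st.st_mtime < cutoff else active
            bucket[ext] += st.st_size
    return active, stale

active, stale = profile_unstructured(SCAN_ROOT, STALE_AFTER_DAYS)
for ext, size in sorted(stale.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{ext:10s} {size / 2**30:8.1f} GiB  (archive candidate)")
```

Even a crude report like this usually makes the case on its own: the biggest, stalest buckets tend to be video, images, and other content that has no business in the nightly backup window.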
We’re really excited for VMworld this week - once again one of the best opportunities to showcase our technologies and the ways we are helping customers rethink how they protect VMware data, as well as other data types. At VMworld, we’ll be talking about how “Backup is Busted.” What does that mean? It means that batch backup was not designed for protecting virtual servers, or other specialized, challenging data like large unstructured file content. When it comes to virtual servers and VMware data, we’ll be showcasing our vmPRO and DXi-V deduplication technology, which takes a unique approach to protecting VMs. Rather than the traditional, network- and resource-intensive approach of running batch backups in proprietary formats, vmPRO snaps VMs in native format, first prepping the VMs for maximum data reduction and network savings. vmPRO uses the native tools available in VMware instead of treating VMs like physical servers.
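To make the “native tools” point concrete, here is a minimal sketch using pyVmomi, VMware’s Python SDK for the vSphere API, to take a quiesced, native-format snapshot of a VM. To be clear, this is not vmPRO’s implementation, just an illustration of the kind of native VMware operation such an approach builds on; the vCenter host, credentials, and VM name are placeholders.

```python
# Illustration only: a quiesced snapshot via VMware's native vSphere API
# (pyVmomi). Host, credentials, and VM name are placeholders; this is not
# vmPRO's code, just the kind of native operation it builds on.
import ssl

from pyVim.connect import SmartConnect, Disconnect
from pyVim.task import WaitForTask
from pyVmomi import vim

context = ssl._create_unverified_context()  # acceptable in a lab, not in production
si = SmartConnect(host="vcenter.example.com", user="backup-svc",
                  pwd="example-password", sslContext=context)
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    vm = next(v for v in view.view if v.name == "app-server-01")

    # Quiesced, no memory state: the guest flushes I/O first, so the
    # snapshot is consistent and the VM data stays in native format.
    WaitForTask(vm.CreateSnapshot_Task(name="pre-backup",
                                       description="point-in-time copy",
                                       memory=False, quiesce=True))
finally:
    Disconnect(si)
```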
Deduplication is now widely recognized as a proven technology in the datacenter. In fact, it seems to be cropping up everywhere – from flash arrays to backup applications, and of course disk backup appliances. There’s no end in sight for structured and unstructured data growth, and with the proliferation of technologies like deduplication, it’s no surprise that complexity increases, as does the challenge of keeping it in check. A good first step is to recognize where deduplication can best be applied. Here are a few of the considerations.
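A small example helps make the mechanics concrete before weighing those considerations. The sketch below implements the simplest possible form, fixed-block deduplication: hash each block and store only the unique ones. Production appliances such as DXi use more sophisticated variable-length blocking, but the principle is the same, and the example also shows why already-compressed or encrypted data barely dedupes.

```python
import hashlib
import os

def dedupe_stats(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, count unique blocks by SHA-256,
    and report logical size, stored size, and the dedupe ratio."""
    unique = set()
    total_blocks = 0
    for offset in range(0, len(data), block_size):
        block = data[offset:offset + block_size]
        unique.add(hashlib.sha256(block).digest())
        total_blocks += 1
    logical = total_blocks * block_size
    stored = len(unique) * block_size
    return logical, stored, (logical / stored) if stored else 0.0

# A repetitive backup stream dedupes dramatically ...
repetitive = (b"A" * 4096) * 100 + (b"B" * 4096) * 100
print(dedupe_stats(repetitive))   # 200 blocks, 2 unique -> 100:1

# ... while random-looking (compressed/encrypted) data barely dedupes.
random_like = os.urandom(4096 * 200)
print(dedupe_stats(random_like))  # ~200 unique blocks -> ~1:1
```

Repetitive streams collapse to a fraction of their logical size, while random data yields roughly 1:1, which is exactly why knowing your data’s composition matters before deciding where to apply deduplication.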
Lately, I’ve been spending a lot of time exploring the differences between data (as in “Big Data”) and information. There’s a very interesting conceptual model, attributed to American organizational theorist Russell Ackoff, outlining the relationship between data, information, knowledge, understanding, and wisdom (D-I-K-U-W for brevity’s sake). For a nice introduction to this model, you can read the article “Data, Information, Knowledge, and Wisdom” by Gene Bellinger, Durval Castro, and Anthony Mills.
Data protection strategies have been in a state of accelerated evolution over the last five years. I hear this confirmed regularly by customers describing their implementation stories with Quantum, as well as by the industry analysts we meet with to discuss our latest product innovations. ESG’s Jason Buffington is one of the analysts we talk with often, and it’s always interesting to see how ESG’s research squares with what we’re seeing in data centers. Jason’s latest video blog about modernizing data protection – 8 Suggestions for Every Data Protection Strategy – highlights ESG research that resonated with me in a number of respects.
This article originally appeared on Wired Magazine’s Innovation Insights. With the start of the new year, it’s time once again for those of us in enterprise storage to look ahead and offer our predictions for what the industry will see in 2014. So without further ado, here are ten trends that will have a big impact in the coming year.
As the volume of data has increased, there has been a shift in the way companies use and access that data. That means it’s time to change the way you think about data protection, retention, and accessibility. Organizations of all sizes recognize that data can help them gain competitive advantages and even support new revenue streams, but this is placing new demands on IT to store that data and preserve access to it. Companies need new solutions and technologies to support unpredictable, on-demand access and to incorporate new approaches to backup and archiving. It’s time to reTHINK Backup & Archive.
As my colleague Terry Grulke pointed out earlier, there is a lot of funny math used by deduplication vendors to try to convince you that their system can go fast. With our DXi systems we don’t have to hire Cirque du Soleil to generate our performance numbers. We can keep it simple because DXi systems are just really, really fast – natively. That’s what I’m going to talk about here: “native” performance. That is, the capability of the DXi system itself vs. some manufactured “logical” number like the ones Terry wrote about. Apparently, our high performance is confusing to some of our competitors.
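For readers who haven’t seen the trick, here is the arithmetic in miniature: take a real, native ingest rate, multiply it by an assumed deduplication ratio, and advertise the product as throughput. Both figures below are invented purely for illustration, not taken from any vendor’s benchmark.

```python
# How a "logical" throughput number gets manufactured from a native one.
# Both figures are invented for illustration, not vendor benchmarks.
native_tb_per_hour = 10      # what the appliance actually ingests
assumed_dedupe_ratio = 20    # the multiplier marketing picks

logical_tb_per_hour = native_tb_per_hour * assumed_dedupe_ratio
print(f"Native ingest:   {native_tb_per_hour} TB/hr")
print(f'"Logical" claim: {logical_tb_per_hour} TB/hr '
      f"(true only if every job dedupes at exactly {assumed_dedupe_ratio}:1)")
```

The “logical” figure is only as good as the assumed ratio, which varies with data type and change rate; the native number is the one the hardware actually has to deliver every night.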