As I noted in an earlier blog, customers planning to move data applications (e.g., backup and archive) to the cloud must consider several key factors in selecting and migrating data: ongoing data transfer volume, expected frequency of ongoing data usage, data recall performance requirements, and application integration. In the next several blogs, I’d like to illustrate the importance of these factors by showing how they impact your design and planning as you migrate a few common data use cases to the cloud. The four use cases we’ll consider are: central site disaster recovery, data center off-site copy (for backup), compliance archive, and remote site primary archive and backup, along with the ongoing management of each. Let’s start by looking at central site disaster recovery.
We’re really excited for VMworld this week - once again one of the best opportunities to showcase our technologies and the ways that we are helping customers rethink how they protect VMware data, as well as other data types. At VMworld, we’ll be talking about how “Backup is Busted.” What does that mean? It means that batch backup was not designed for protecting virtual servers, or for other specialized, challenging data like large unstructured file content. When it comes to virtual servers and VMware data, we’ll be showcasing our vmPRO and DXi-V deduplication technology, which uses a unique approach to protecting VMs. Rather than the traditional, network- and resource-intensive approach of doing batch backups in proprietary formats, vmPRO snaps VMs in native format, first prepping the VMs for maximum data reduction and network savings. vmPRO uses the native tools available in VMware, instead of treating VMs like physical servers.
Lately, I’ve been spending a lot of time exploring the differences between data (as in “Big Data”) and information. There’s a very interesting conceptual model outlining the relationship between data, information, knowledge, understanding, and wisdom (D-I-K-U-W for brevity’s sake), attributed to American organizational theorist Russell Ackoff. For a nice introduction to this model, you can read the article “Data, Information, Knowledge, and Wisdom,” by Gene Bellinger, Durval Castro, and Anthony Mills.
Why do companies continue to store this unstructured data on their most expensive, highest-performance storage? A better approach – and a way companies can completely rethink their backup and archive approach to unstructured data – is to employ tiered storage. Quantum is a specialist in designing tiered storage solutions for unstructured data – we’ve been doing it for years in the most demanding data environments, like M&E, government, and oil and gas. So we can use some of our core technologies and our core approach to design tiered storage solutions for the data center.
How do they go so fast? We are continually battling competitors who state ingest performance using numbers that defy logic. That is, we compete against systems with four 10GbE ports that supposedly ingest at 100TB/hour. The following example is not meant to spark debate about specific mathematical accuracy, but to educate you on how they are “cooking the books.”
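To see why such a claim defies wire speed, here is a quick back-of-the-envelope check (a sketch only; the 100TB/hour figure is the competitor claim cited above, and we ignore protocol overhead, which only makes the real ceiling lower):

```python
# Sanity check: can four 10GbE ports physically ingest 100 TB/hour?
ports = 4
link_gbps = 10           # gigabits per second per 10GbE port
bits_per_byte = 8

# Aggregate raw wire speed, best case (no protocol overhead)
raw_gb_per_sec = ports * link_gbps / bits_per_byte       # 5 GB/s
raw_tb_per_hour = raw_gb_per_sec * 3600 / 1000           # 18 TB/hour

claimed_tb_per_hour = 100
implied_reduction = claimed_tb_per_hour / raw_tb_per_hour

print(f"Physical ceiling: {raw_tb_per_hour:.0f} TB/hour")
print(f"Claiming {claimed_tb_per_hour} TB/hour implies ~{implied_reduction:.1f}x "
      "data reduction before the data ever crosses the wire")
```

In other words, the only way to advertise 100TB/hour through those ports is to assume the data has already been deduplicated or compressed more than fivefold at the source, and then count the logical (pre-reduction) size as “ingest.”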
IT departments today are rapidly deploying virtualization technologies – in fact, Gartner reports that server virtualization is already over 60% penetrated and projects the market will be over 80% penetrated by 2016. With this rapid rise in virtualization deployment, customers are challenged to incorporate data protection and archive methodologies for their virtualized data. ESG recently indicated that 60% of virtualization technology users plan to address the challenges associated with protecting their virtualized data as a top priority for 2013 – an astonishing number. There are certainly lots of options available to protect traditional data, but virtualized data is a different beast. Another Gartner report finds that over one-third of organizations will change backup vendors due to factors including cost, complexity, or capability. Based on what I’ve heard talking with customers, I would add one more factor: compatibility. In this article we will explore all four of these areas and suggest ways to overcome the challenges associated with virtualized data protection.
The convergence of backup and archive is a hot topic right now. Quantum, along with some of our industry partners, is introducing capabilities that genuinely bring backup and archive together. An extreme view on this topic is that batch backup is a thing of the past, and for some data types and use cases there is some truth to that. But to unpack this a bit, we need to look at use cases and even specific data types.
In my first two blogs, the discussions centered on using native format for amazingly fast restores and for booting VMs remotely, all without actually needing the backup application. In this final post I will explore how native format can provide “future-proof flexibility.” When I hear this term, I am always reminded of the Best Buy commercial where consumers are constantly, and humorously, reminded of the fast pace of technology. If you are like me, having something new and shiny is always nice; however, life gets in the way and we need to spend our finances elsewhere. The same goes for our IT budgets.
If you had the chance to read my first blog in this series, you will remember that vmPRO is the only backup application that writes data in native format, which guarantees fast restores using the vmPRO GUI or a standard file browser with easy drag-and-drop functionality. In this post, the second of this three-part series, I will cover how the native backup format can help transform your disaster recovery (DR) strategy by booting VMs and restoring files remotely – no matter the target device: disk, tape, or cloud – all without actually using the backup application.
I love talking to customers about their VM backup challenges. One of the most exciting topics for me is the native file format capability of Quantum’s vmPRO. In this three-part series, I’ll talk about three reasons you’re going to love backing up your VMs in native file format. While proprietary formats have been the norm in the storage industry, as technologies and data access requirements evolve, customers no longer need to settle for this traditional method. Think about how native format works for simple data transfer and storage. For example, when you want to protect your precious digital pictures and store them on a thumb drive or USB hard drive, you simply drag and drop the files to the device. When you want to restore them, you simply plug the device into your computer and drag and drop the files back to your system. vmPRO takes that simplified concept and applies it at the business-class level to virtual data. Why over-complicate things?
I was talking to an SMB customer this week, and he was raving about the benefits of deduplication. Deploying deduplication allowed him to completely rethink his data protection policies and reduce his management time. He’s in a smaller IT department where, as he said, “we have to wear many hats and don’t have the luxury of in-house specialists. It was so simple, no worries about backup windows, or any of those details that we used to regularly consider and tweak with the older backup system.” His statement got me thinking: the same type of change is happening with VM data protection. As noted in this previous blog from TeamQ, VM deployments are growing rapidly, but protecting VM data creates some unique challenges.
If you have questions about data protection for virtual machines, we invite you to download “VM Data Protection for Dummies, Quantum Special Edition.” This free e-book will help you identify your VM protection needs, ask the right questions of potential vendors, and choose the VM protection solution that’s right for your organization. The FAQ chapter covers 10 frequently asked questions about VM data protection, such as “How does VM backup work with deduplication appliances?” and “Do I have to choose between one of the new VM-only backup applications and my legacy backup software?”